00:00:00.001 Started by upstream project "autotest-per-patch" build number 126149
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.010 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.010 The recommended git tool is: git
00:00:00.011 using credential 00000000-0000-0000-0000-000000000002
00:00:00.012 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.029 Fetching changes from the remote Git repository
00:00:00.033 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.048 Using shallow fetch with depth 1
00:00:00.048 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.048 > git --version # timeout=10
00:00:00.064 > git --version # 'git version 2.39.2'
00:00:00.064 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.095 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.095 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.222 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.230 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.241 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.241 > git config core.sparsecheckout # timeout=10
00:00:02.252 > git read-tree -mu HEAD # timeout=10
00:00:02.268 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.286 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.286 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.381 [Pipeline] Start of Pipeline
00:00:02.397 [Pipeline] library
00:00:02.399 Loading library shm_lib@master
00:00:02.399 Library shm_lib@master is cached. Copying from home.
00:00:02.419 [Pipeline] node
00:00:02.430 Running on CYP6 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.432 [Pipeline] {
00:00:02.445 [Pipeline] catchError
00:00:02.446 [Pipeline] {
00:00:02.461 [Pipeline] wrap
00:00:02.472 [Pipeline] {
00:00:02.481 [Pipeline] stage
00:00:02.483 [Pipeline] { (Prologue)
00:00:02.680 [Pipeline] sh
00:00:02.995 + logger -p user.info -t JENKINS-CI
00:00:03.019 [Pipeline] echo
00:00:03.021 Node: CYP6
00:00:03.030 [Pipeline] sh
00:00:03.352 [Pipeline] setCustomBuildProperty
00:00:03.366 [Pipeline] echo
00:00:03.368 Cleanup processes
00:00:03.374 [Pipeline] sh
00:00:03.663 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.663 1405811 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.680 [Pipeline] sh
00:00:03.972 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.972 ++ grep -v 'sudo pgrep'
00:00:03.972 ++ awk '{print $1}'
00:00:03.972 + sudo kill -9
00:00:03.972 + true
00:00:03.987 [Pipeline] cleanWs
00:00:03.997 [WS-CLEANUP] Deleting project workspace...
00:00:03.997 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.004 [WS-CLEANUP] done
00:00:04.008 [Pipeline] setCustomBuildProperty
00:00:04.025 [Pipeline] sh
00:00:04.309 + sudo git config --global --replace-all safe.directory '*'
00:00:04.388 [Pipeline] httpRequest
00:00:04.497 [Pipeline] echo
00:00:04.498 Sorcerer 10.211.164.101 is alive
00:00:04.504 [Pipeline] httpRequest
00:00:04.510 HttpMethod: GET
00:00:04.510 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.511 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.514 Response Code: HTTP/1.1 200 OK
00:00:04.515 Success: Status code 200 is in the accepted range: 200,404
00:00:04.515 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.118 [Pipeline] sh
00:00:05.402 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.416 [Pipeline] httpRequest
00:00:05.431 [Pipeline] echo
00:00:05.433 Sorcerer 10.211.164.101 is alive
00:00:05.440 [Pipeline] httpRequest
00:00:05.445 HttpMethod: GET
00:00:05.446 URL: http://10.211.164.101/packages/spdk_897e912d5ef39b95adea4d69f24b5af81e596e94.tar.gz
00:00:05.446 Sending request to url: http://10.211.164.101/packages/spdk_897e912d5ef39b95adea4d69f24b5af81e596e94.tar.gz
00:00:05.459 Response Code: HTTP/1.1 200 OK
00:00:05.460 Success: Status code 200 is in the accepted range: 200,404
00:00:05.460 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_897e912d5ef39b95adea4d69f24b5af81e596e94.tar.gz
00:00:53.939 [Pipeline] sh
00:00:54.229 + tar --no-same-owner -xf spdk_897e912d5ef39b95adea4d69f24b5af81e596e94.tar.gz
00:00:57.543 [Pipeline] sh
00:00:57.833 + git -C spdk log --oneline -n5
00:00:57.833 897e912d5 lib/ublk: wait and retry before starting USER RECOVERY
00:00:57.833 719d03c6a sock/uring: only register net impl if supported
00:00:57.833 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:00:57.833 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:00:57.833 6c7c1f57e accel: add sequence outstanding stat
00:00:57.848 [Pipeline] }
00:00:57.867 [Pipeline] // stage
00:00:57.877 [Pipeline] stage
00:00:57.880 [Pipeline] { (Prepare)
00:00:57.895 [Pipeline] writeFile
00:00:57.907 [Pipeline] sh
00:00:58.190 + logger -p user.info -t JENKINS-CI
00:00:58.204 [Pipeline] sh
00:00:58.490 + logger -p user.info -t JENKINS-CI
00:00:58.504 [Pipeline] sh
00:00:58.790 + cat autorun-spdk.conf
00:00:58.790 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:58.790 SPDK_TEST_BLOCKDEV=1
00:00:58.790 SPDK_TEST_ISAL=1
00:00:58.790 SPDK_TEST_CRYPTO=1
00:00:58.790 SPDK_TEST_REDUCE=1
00:00:58.790 SPDK_TEST_VBDEV_COMPRESS=1
00:00:58.790 SPDK_RUN_UBSAN=1
00:00:58.798 RUN_NIGHTLY=0
00:00:58.802 [Pipeline] readFile
00:00:58.829 [Pipeline] withEnv
00:00:58.831 [Pipeline] {
00:00:58.846 [Pipeline] sh
00:00:59.133 + set -ex
00:00:59.133 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:59.133 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:59.133 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:59.133 ++ SPDK_TEST_BLOCKDEV=1
00:00:59.133 ++ SPDK_TEST_ISAL=1
00:00:59.133 ++ SPDK_TEST_CRYPTO=1
00:00:59.133 ++ SPDK_TEST_REDUCE=1
00:00:59.133 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:59.133 ++ SPDK_RUN_UBSAN=1
00:00:59.133 ++ RUN_NIGHTLY=0
00:00:59.133 + case $SPDK_TEST_NVMF_NICS in
00:00:59.133 + DRIVERS=
00:00:59.133 + [[ -n '' ]]
00:00:59.133 + exit 0
00:00:59.143 [Pipeline] }
00:00:59.160 [Pipeline] // withEnv
00:00:59.165 [Pipeline] }
00:00:59.183 [Pipeline] // stage
00:00:59.193 [Pipeline] catchError
00:00:59.195 [Pipeline] {
00:00:59.210 [Pipeline] timeout
00:00:59.210 Timeout set to expire in 40 min
00:00:59.212 [Pipeline] {
00:00:59.227 [Pipeline] stage
00:00:59.229 [Pipeline] { (Tests)
00:00:59.244 [Pipeline] sh
00:00:59.530 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:59.530 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:59.530 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:59.530 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:59.530 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:59.530 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:59.530 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:59.530 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:59.530 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:59.530 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:59.530 + [[ crypto-phy-autotest == pkgdep-* ]]
00:00:59.530 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:59.530 + source /etc/os-release
00:00:59.530 ++ NAME='Fedora Linux'
00:00:59.530 ++ VERSION='38 (Cloud Edition)'
00:00:59.530 ++ ID=fedora
00:00:59.530 ++ VERSION_ID=38
00:00:59.530 ++ VERSION_CODENAME=
00:00:59.530 ++ PLATFORM_ID=platform:f38
00:00:59.530 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:59.530 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:59.530 ++ LOGO=fedora-logo-icon
00:00:59.530 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:59.530 ++ HOME_URL=https://fedoraproject.org/
00:00:59.530 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:59.530 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:59.530 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:59.530 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:59.530 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:59.530 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:59.530 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:59.530 ++ SUPPORT_END=2024-05-14
00:00:59.530 ++ VARIANT='Cloud Edition'
00:00:59.530 ++ VARIANT_ID=cloud
00:00:59.530 + uname -a
00:00:59.530 Linux spdk-CYP-06 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:00:59.530 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:01:02.832 Hugepages
00:01:02.832 node hugesize free / total
00:01:02.832 node0 1048576kB 0 / 0
00:01:02.832 node0 2048kB 0 / 0
00:01:02.832 node1 1048576kB 0 / 0
00:01:02.832 node1 2048kB 0 / 0
00:01:02.833
00:01:02.833 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:02.833 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - -
00:01:02.833 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - -
00:01:02.833 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - -
00:01:03.093 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - -
00:01:03.093 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - -
00:01:03.093 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - -
00:01:03.093 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - -
00:01:03.093 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - -
00:01:03.093 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:01:03.093 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - -
00:01:03.093 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - -
00:01:03.093 + rm -f /tmp/spdk-ld-path
00:01:03.093 + source autorun-spdk.conf
00:01:03.093 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:03.093 ++ SPDK_TEST_BLOCKDEV=1
00:01:03.093 ++ SPDK_TEST_ISAL=1
00:01:03.093 ++ SPDK_TEST_CRYPTO=1
00:01:03.093 ++ SPDK_TEST_REDUCE=1
00:01:03.093 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:03.093 ++ SPDK_RUN_UBSAN=1
00:01:03.093 ++ RUN_NIGHTLY=0
00:01:03.093 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:03.093 + [[ -n '' ]]
00:01:03.094 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:03.094 + for M in /var/spdk/build-*-manifest.txt
00:01:03.094 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:03.094 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:03.094 + for M in /var/spdk/build-*-manifest.txt
00:01:03.094 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:03.094 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:03.094 ++ uname
00:01:03.094 + [[ Linux == \L\i\n\u\x ]]
00:01:03.094 + sudo dmesg -T
00:01:03.094 + sudo dmesg --clear
00:01:03.356 + dmesg_pid=1407472
00:01:03.356 + [[ Fedora Linux == FreeBSD ]]
00:01:03.356 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:03.356 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:03.356 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:03.356 + [[ -x /usr/src/fio-static/fio ]]
00:01:03.356 + export FIO_BIN=/usr/src/fio-static/fio
00:01:03.356 + FIO_BIN=/usr/src/fio-static/fio
00:01:03.356 + sudo dmesg -Tw
00:01:03.356 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:03.356 + [[ !
-v VFIO_QEMU_BIN ]] 00:01:03.356 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:03.356 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:03.356 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:03.356 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:03.356 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:03.356 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:03.356 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:03.356 Test configuration: 00:01:03.356 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:03.356 SPDK_TEST_BLOCKDEV=1 00:01:03.356 SPDK_TEST_ISAL=1 00:01:03.356 SPDK_TEST_CRYPTO=1 00:01:03.356 SPDK_TEST_REDUCE=1 00:01:03.356 SPDK_TEST_VBDEV_COMPRESS=1 00:01:03.356 SPDK_RUN_UBSAN=1 00:01:03.356 RUN_NIGHTLY=0 07:36:47 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:03.356 07:36:47 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:03.356 07:36:47 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:03.356 07:36:47 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:03.356 07:36:47 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.356 07:36:47 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.356 07:36:47 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.356 07:36:47 -- paths/export.sh@5 -- $ export PATH 00:01:03.356 07:36:47 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:03.356 07:36:47 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:03.356 07:36:47 -- common/autobuild_common.sh@444 -- $ date +%s 00:01:03.356 07:36:47 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721021807.XXXXXX 00:01:03.356 07:36:47 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721021807.AueRRr 00:01:03.356 07:36:47 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:01:03.356 07:36:47 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:01:03.356 07:36:47 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:03.356 07:36:47 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:03.356 07:36:47 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:03.356 07:36:47 -- common/autobuild_common.sh@460 -- $ get_config_params 00:01:03.356 07:36:47 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:03.356 07:36:47 -- common/autotest_common.sh@10 -- $ set +x 00:01:03.356 07:36:48 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:03.356 07:36:48 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:01:03.356 07:36:48 -- pm/common@17 -- $ local monitor 00:01:03.356 07:36:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.356 07:36:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.356 07:36:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.356 07:36:48 -- pm/common@21 -- $ date +%s 00:01:03.356 07:36:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:03.356 07:36:48 -- pm/common@25 -- $ sleep 1 00:01:03.356 07:36:48 -- pm/common@21 -- $ date +%s 00:01:03.356 07:36:48 -- pm/common@21 -- $ date +%s 00:01:03.356 07:36:48 -- pm/common@21 -- $ date +%s 00:01:03.356 07:36:48 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721021808 00:01:03.356 07:36:48 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721021808 00:01:03.356 07:36:48 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721021808 00:01:03.356 07:36:48 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721021808 00:01:03.356 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721021808_collect-vmstat.pm.log 00:01:03.356 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721021808_collect-cpu-load.pm.log 00:01:03.356 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721021808_collect-cpu-temp.pm.log 00:01:03.356 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721021808_collect-bmc-pm.bmc.pm.log 00:01:04.303 07:36:49 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:01:04.303 07:36:49 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD=
00:01:04.303 07:36:49 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:04.303 07:36:49 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:04.303 07:36:49 -- spdk/autobuild.sh@16 -- $ date -u
00:01:04.303 Mon Jul 15 05:36:49 AM UTC 2024
00:01:04.303 07:36:49 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:04.303 v24.09-pre-203-g897e912d5
00:01:04.303 07:36:49 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:04.303 07:36:49 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:04.303 07:36:49 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:04.303 07:36:49 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:04.303 07:36:49 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:04.303 07:36:49 -- common/autotest_common.sh@10 -- $ set +x
00:01:04.564 ************************************
00:01:04.564 START TEST ubsan
00:01:04.564 ************************************
00:01:04.564 07:36:49 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:01:04.564 using ubsan
00:01:04.564
00:01:04.564 real 0m0.001s
00:01:04.564 user 0m0.000s
00:01:04.564 sys 0m0.000s
00:01:04.564 07:36:49 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:01:04.564 07:36:49 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:04.564 ************************************
00:01:04.564 END TEST ubsan
00:01:04.564 ************************************
00:01:04.564 07:36:49 -- common/autotest_common.sh@1142 -- $ return 0
00:01:04.564 07:36:49 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:04.564 07:36:49 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:04.564 07:36:49 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:04.564 07:36:49 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:01:04.564 07:36:49 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:04.564 07:36:49 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:04.564 07:36:49 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:04.564 07:36:49 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:01:04.564 07:36:49 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:04.564 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:04.564 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:05.142 Using 'verbs' RDMA provider
00:01:20.988 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:33.215 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:33.215 Creating mk/config.mk...done.
00:01:33.215 Creating mk/cc.flags.mk...done.
00:01:33.215 Type 'make' to build.
00:01:33.215 07:37:17 -- spdk/autobuild.sh@69 -- $ run_test make make -j128 00:01:33.215 07:37:17 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:33.215 07:37:17 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:33.215 07:37:17 -- common/autotest_common.sh@10 -- $ set +x 00:01:33.215 ************************************ 00:01:33.215 START TEST make 00:01:33.215 ************************************ 00:01:33.215 07:37:17 make -- common/autotest_common.sh@1123 -- $ make -j128 00:01:33.787 make[1]: Nothing to be done for 'all'. 00:02:12.545 The Meson build system 00:02:12.545 Version: 1.3.1 00:02:12.545 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:12.545 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:12.545 Build type: native build 00:02:12.545 Program cat found: YES (/usr/bin/cat) 00:02:12.545 Project name: DPDK 00:02:12.545 Project version: 24.03.0 00:02:12.545 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:12.545 C linker for the host machine: cc ld.bfd 2.39-16 00:02:12.545 Host machine cpu family: x86_64 00:02:12.545 Host machine cpu: x86_64 00:02:12.545 Message: ## Building in Developer Mode ## 00:02:12.545 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:12.545 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:12.545 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:12.545 Program python3 found: YES (/usr/bin/python3) 00:02:12.545 Program cat found: YES (/usr/bin/cat) 00:02:12.545 Compiler for C supports arguments -march=native: YES 00:02:12.545 Checking for size of "void *" : 8 00:02:12.545 Checking for size of "void *" : 8 (cached) 00:02:12.545 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:12.545 Library m found: YES 00:02:12.545 Library numa found: YES 00:02:12.545 Has header "numaif.h" : YES 00:02:12.545 Library fdt found: NO 00:02:12.545 Library execinfo found: NO 00:02:12.545 Has header "execinfo.h" : YES 00:02:12.545 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:12.545 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:12.545 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:12.545 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:12.545 Run-time dependency openssl found: YES 3.0.9 00:02:12.545 Run-time dependency libpcap found: YES 1.10.4 00:02:12.545 Has header "pcap.h" with dependency libpcap: YES 00:02:12.545 Compiler for C supports arguments -Wcast-qual: YES 00:02:12.545 Compiler for C supports arguments -Wdeprecated: YES 00:02:12.545 Compiler for C supports arguments -Wformat: YES 00:02:12.545 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:12.545 Compiler for C supports arguments -Wformat-security: NO 00:02:12.545 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:12.545 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:12.545 Compiler for C supports arguments -Wnested-externs: YES 00:02:12.545 Compiler for C supports arguments -Wold-style-definition: YES 00:02:12.545 Compiler for C supports arguments -Wpointer-arith: YES 00:02:12.545 Compiler for C supports arguments -Wsign-compare: YES 00:02:12.545 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:12.545 Compiler for C supports arguments -Wundef: YES 00:02:12.545 Compiler for C 
supports arguments -Wwrite-strings: YES 00:02:12.545 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:12.545 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:12.545 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:12.545 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:12.545 Program objdump found: YES (/usr/bin/objdump) 00:02:12.545 Compiler for C supports arguments -mavx512f: YES 00:02:12.545 Checking if "AVX512 checking" compiles: YES 00:02:12.545 Fetching value of define "__SSE4_2__" : 1 00:02:12.545 Fetching value of define "__AES__" : 1 00:02:12.545 Fetching value of define "__AVX__" : 1 00:02:12.545 Fetching value of define "__AVX2__" : 1 00:02:12.545 Fetching value of define "__AVX512BW__" : 1 00:02:12.545 Fetching value of define "__AVX512CD__" : 1 00:02:12.545 Fetching value of define "__AVX512DQ__" : 1 00:02:12.545 Fetching value of define "__AVX512F__" : 1 00:02:12.545 Fetching value of define "__AVX512VL__" : 1 00:02:12.545 Fetching value of define "__PCLMUL__" : 1 00:02:12.545 Fetching value of define "__RDRND__" : 1 00:02:12.545 Fetching value of define "__RDSEED__" : 1 00:02:12.545 Fetching value of define "__VPCLMULQDQ__" : 1 00:02:12.545 Fetching value of define "__znver1__" : (undefined) 00:02:12.545 Fetching value of define "__znver2__" : (undefined) 00:02:12.545 Fetching value of define "__znver3__" : (undefined) 00:02:12.545 Fetching value of define "__znver4__" : (undefined) 00:02:12.545 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:12.545 Message: lib/log: Defining dependency "log" 00:02:12.545 Message: lib/kvargs: Defining dependency "kvargs" 00:02:12.545 Message: lib/telemetry: Defining dependency "telemetry" 00:02:12.545 Checking for function "getentropy" : NO 00:02:12.545 Message: lib/eal: Defining dependency "eal" 00:02:12.545 Message: lib/ring: Defining dependency "ring" 00:02:12.545 Message: lib/rcu: Defining dependency "rcu" 00:02:12.545 Message: lib/mempool: Defining dependency "mempool" 00:02:12.545 Message: lib/mbuf: Defining dependency "mbuf" 00:02:12.545 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:12.545 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:12.545 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:12.545 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:12.545 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:12.545 Fetching value of define "__VPCLMULQDQ__" : 1 (cached) 00:02:12.545 Compiler for C supports arguments -mpclmul: YES 00:02:12.545 Compiler for C supports arguments -maes: YES 00:02:12.545 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:12.545 Compiler for C supports arguments -mavx512bw: YES 00:02:12.545 Compiler for C supports arguments -mavx512dq: YES 00:02:12.545 Compiler for C supports arguments -mavx512vl: YES 00:02:12.545 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:12.545 Compiler for C supports arguments -mavx2: YES 00:02:12.545 Compiler for C supports arguments -mavx: YES 00:02:12.545 Message: lib/net: Defining dependency "net" 00:02:12.545 Message: lib/meter: Defining dependency "meter" 00:02:12.545 Message: lib/ethdev: Defining dependency "ethdev" 00:02:12.545 Message: lib/pci: Defining dependency "pci" 00:02:12.545 Message: lib/cmdline: Defining dependency "cmdline" 00:02:12.545 Message: lib/hash: Defining dependency "hash" 00:02:12.545 Message: lib/timer: Defining dependency "timer" 00:02:12.545 Message: lib/compressdev: 
Defining dependency "compressdev" 00:02:12.545 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:12.545 Message: lib/dmadev: Defining dependency "dmadev" 00:02:12.545 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:12.545 Message: lib/power: Defining dependency "power" 00:02:12.545 Message: lib/reorder: Defining dependency "reorder" 00:02:12.545 Message: lib/security: Defining dependency "security" 00:02:12.545 Has header "linux/userfaultfd.h" : YES 00:02:12.545 Has header "linux/vduse.h" : YES 00:02:12.545 Message: lib/vhost: Defining dependency "vhost" 00:02:12.545 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:12.545 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:02:12.545 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:12.545 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:12.545 Compiler for C supports arguments -std=c11: YES 00:02:12.545 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:02:12.545 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:02:12.545 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:02:12.545 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:02:12.545 Run-time dependency libmlx5 found: YES 1.24.44.0 00:02:12.545 Run-time dependency libibverbs found: YES 1.14.44.0 00:02:12.545 Library mtcr_ul found: NO 00:02:12.545 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:02:12.545 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: 
YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:14.502 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:14.502 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:14.502 Configuring mlx5_autoconf.h using configuration 00:02:14.502 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:14.502 Run-time dependency libcrypto found: YES 3.0.9 00:02:14.502 Library IPSec_MB found: YES 00:02:14.502 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:14.502 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:14.502 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:14.502 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:14.502 Library IPSec_MB found: YES 00:02:14.502 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:14.502 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:14.502 Compiler for C supports arguments -std=c11: YES (cached) 00:02:14.502 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:14.502 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:14.502 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:14.502 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:14.502 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:14.502 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:14.502 Library libisal found: NO 00:02:14.502 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:14.502 Compiler for C supports arguments -std=c11: YES (cached) 00:02:14.502 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:14.502 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:14.502 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:14.502 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:14.502 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:14.502 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:14.502 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:14.502 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:14.502 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:14.502 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:14.502 Program doxygen found: YES (/usr/bin/doxygen) 00:02:14.502 Configuring doxy-api-html.conf using configuration 00:02:14.502 Configuring doxy-api-man.conf using configuration 00:02:14.502 Program mandb found: YES (/usr/bin/mandb) 00:02:14.502 Program sphinx-build found: NO 00:02:14.502 Configuring rte_build_config.h using configuration 00:02:14.502 Message: 00:02:14.502 ================= 00:02:14.502 Applications Enabled 00:02:14.502 ================= 00:02:14.502 00:02:14.502 apps: 00:02:14.502 00:02:14.502 00:02:14.502 Message: 00:02:14.502 ================= 00:02:14.502 Libraries Enabled 00:02:14.502 ================= 00:02:14.502 00:02:14.502 libs: 00:02:14.502 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:14.502 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:14.502 cryptodev, dmadev, power, reorder, security, vhost, 00:02:14.502 00:02:14.502 Message: 00:02:14.502 =============== 00:02:14.502 Drivers Enabled 00:02:14.502 =============== 00:02:14.502 00:02:14.502 common: 00:02:14.502 mlx5, qat, 00:02:14.502 bus: 00:02:14.502 auxiliary, pci, vdev, 00:02:14.502 mempool: 00:02:14.502 ring, 00:02:14.502 dma: 00:02:14.502 00:02:14.502 net: 00:02:14.502 00:02:14.502 crypto: 00:02:14.502 ipsec_mb, mlx5, 00:02:14.502 compress: 00:02:14.502 isal, mlx5, 00:02:14.502 vdpa: 00:02:14.502 00:02:14.502 00:02:14.502 Message: 00:02:14.502 ================= 00:02:14.502 Content Skipped 00:02:14.502 ================= 00:02:14.502 00:02:14.502 apps: 00:02:14.502 dumpcap: explicitly disabled via build config 00:02:14.502 graph: explicitly disabled via build config 00:02:14.502 pdump: explicitly disabled via build config 00:02:14.502 proc-info: explicitly disabled via build config 00:02:14.502 test-acl: explicitly disabled via build config 00:02:14.502 test-bbdev: explicitly disabled via build config 00:02:14.502 test-cmdline: explicitly disabled via build config 00:02:14.502 test-compress-perf: explicitly disabled via build config 00:02:14.502 test-crypto-perf: explicitly disabled via build config 00:02:14.503 test-dma-perf: explicitly disabled via build config 00:02:14.503 test-eventdev: explicitly disabled via build config 00:02:14.503 test-fib: explicitly disabled via 
build config 00:02:14.503 test-flow-perf: explicitly disabled via build config 00:02:14.503 test-gpudev: explicitly disabled via build config 00:02:14.503 test-mldev: explicitly disabled via build config 00:02:14.503 test-pipeline: explicitly disabled via build config 00:02:14.503 test-pmd: explicitly disabled via build config 00:02:14.503 test-regex: explicitly disabled via build config 00:02:14.503 test-sad: explicitly disabled via build config 00:02:14.503 test-security-perf: explicitly disabled via build config 00:02:14.503 00:02:14.503 libs: 00:02:14.503 argparse: explicitly disabled via build config 00:02:14.503 metrics: explicitly disabled via build config 00:02:14.503 acl: explicitly disabled via build config 00:02:14.503 bbdev: explicitly disabled via build config 00:02:14.503 bitratestats: explicitly disabled via build config 00:02:14.503 bpf: explicitly disabled via build config 00:02:14.503 cfgfile: explicitly disabled via build config 00:02:14.503 distributor: explicitly disabled via build config 00:02:14.503 efd: explicitly disabled via build config 00:02:14.503 eventdev: explicitly disabled via build config 00:02:14.503 dispatcher: explicitly disabled via build config 00:02:14.503 gpudev: explicitly disabled via build config 00:02:14.503 gro: explicitly disabled via build config 00:02:14.503 gso: explicitly disabled via build config 00:02:14.503 ip_frag: explicitly disabled via build config 00:02:14.503 jobstats: explicitly disabled via build config 00:02:14.503 latencystats: explicitly disabled via build config 00:02:14.503 lpm: explicitly disabled via build config 00:02:14.503 member: explicitly disabled via build config 00:02:14.503 pcapng: explicitly disabled via build config 00:02:14.503 rawdev: explicitly disabled via build config 00:02:14.503 regexdev: explicitly disabled via build config 00:02:14.503 mldev: explicitly disabled via build config 00:02:14.503 rib: explicitly disabled via build config 00:02:14.503 sched: explicitly disabled via build config 00:02:14.503 stack: explicitly disabled via build config 00:02:14.503 ipsec: explicitly disabled via build config 00:02:14.503 pdcp: explicitly disabled via build config 00:02:14.503 fib: explicitly disabled via build config 00:02:14.503 port: explicitly disabled via build config 00:02:14.503 pdump: explicitly disabled via build config 00:02:14.503 table: explicitly disabled via build config 00:02:14.503 pipeline: explicitly disabled via build config 00:02:14.503 graph: explicitly disabled via build config 00:02:14.503 node: explicitly disabled via build config 00:02:14.503 00:02:14.503 drivers: 00:02:14.503 common/cpt: not in enabled drivers build config 00:02:14.503 common/dpaax: not in enabled drivers build config 00:02:14.503 common/iavf: not in enabled drivers build config 00:02:14.503 common/idpf: not in enabled drivers build config 00:02:14.503 common/ionic: not in enabled drivers build config 00:02:14.503 common/mvep: not in enabled drivers build config 00:02:14.503 common/octeontx: not in enabled drivers build config 00:02:14.503 bus/cdx: not in enabled drivers build config 00:02:14.503 bus/dpaa: not in enabled drivers build config 00:02:14.503 bus/fslmc: not in enabled drivers build config 00:02:14.503 bus/ifpga: not in enabled drivers build config 00:02:14.503 bus/platform: not in enabled drivers build config 00:02:14.503 bus/uacce: not in enabled drivers build config 00:02:14.503 bus/vmbus: not in enabled drivers build config 00:02:14.503 common/cnxk: not in enabled drivers build config 00:02:14.503 
common/nfp: not in enabled drivers build config 00:02:14.503 common/nitrox: not in enabled drivers build config 00:02:14.503 common/sfc_efx: not in enabled drivers build config 00:02:14.503 mempool/bucket: not in enabled drivers build config 00:02:14.503 mempool/cnxk: not in enabled drivers build config 00:02:14.503 mempool/dpaa: not in enabled drivers build config 00:02:14.503 mempool/dpaa2: not in enabled drivers build config 00:02:14.503 mempool/octeontx: not in enabled drivers build config 00:02:14.503 mempool/stack: not in enabled drivers build config 00:02:14.503 dma/cnxk: not in enabled drivers build config 00:02:14.503 dma/dpaa: not in enabled drivers build config 00:02:14.503 dma/dpaa2: not in enabled drivers build config 00:02:14.503 dma/hisilicon: not in enabled drivers build config 00:02:14.503 dma/idxd: not in enabled drivers build config 00:02:14.503 dma/ioat: not in enabled drivers build config 00:02:14.503 dma/skeleton: not in enabled drivers build config 00:02:14.503 net/af_packet: not in enabled drivers build config 00:02:14.503 net/af_xdp: not in enabled drivers build config 00:02:14.503 net/ark: not in enabled drivers build config 00:02:14.503 net/atlantic: not in enabled drivers build config 00:02:14.503 net/avp: not in enabled drivers build config 00:02:14.503 net/axgbe: not in enabled drivers build config 00:02:14.503 net/bnx2x: not in enabled drivers build config 00:02:14.503 net/bnxt: not in enabled drivers build config 00:02:14.503 net/bonding: not in enabled drivers build config 00:02:14.503 net/cnxk: not in enabled drivers build config 00:02:14.503 net/cpfl: not in enabled drivers build config 00:02:14.503 net/cxgbe: not in enabled drivers build config 00:02:14.503 net/dpaa: not in enabled drivers build config 00:02:14.503 net/dpaa2: not in enabled drivers build config 00:02:14.503 net/e1000: not in enabled drivers build config 00:02:14.503 net/ena: not in enabled drivers build config 00:02:14.503 net/enetc: not in enabled drivers build config 00:02:14.503 net/enetfec: not in enabled drivers build config 00:02:14.503 net/enic: not in enabled drivers build config 00:02:14.503 net/failsafe: not in enabled drivers build config 00:02:14.503 net/fm10k: not in enabled drivers build config 00:02:14.503 net/gve: not in enabled drivers build config 00:02:14.503 net/hinic: not in enabled drivers build config 00:02:14.503 net/hns3: not in enabled drivers build config 00:02:14.503 net/i40e: not in enabled drivers build config 00:02:14.503 net/iavf: not in enabled drivers build config 00:02:14.503 net/ice: not in enabled drivers build config 00:02:14.503 net/idpf: not in enabled drivers build config 00:02:14.503 net/igc: not in enabled drivers build config 00:02:14.503 net/ionic: not in enabled drivers build config 00:02:14.503 net/ipn3ke: not in enabled drivers build config 00:02:14.503 net/ixgbe: not in enabled drivers build config 00:02:14.503 net/mana: not in enabled drivers build config 00:02:14.503 net/memif: not in enabled drivers build config 00:02:14.503 net/mlx4: not in enabled drivers build config 00:02:14.503 net/mlx5: not in enabled drivers build config 00:02:14.503 net/mvneta: not in enabled drivers build config 00:02:14.503 net/mvpp2: not in enabled drivers build config 00:02:14.503 net/netvsc: not in enabled drivers build config 00:02:14.503 net/nfb: not in enabled drivers build config 00:02:14.503 net/nfp: not in enabled drivers build config 00:02:14.503 net/ngbe: not in enabled drivers build config 00:02:14.503 net/null: not in enabled drivers build config 
00:02:14.503 net/octeontx: not in enabled drivers build config 00:02:14.503 net/octeon_ep: not in enabled drivers build config 00:02:14.503 net/pcap: not in enabled drivers build config 00:02:14.503 net/pfe: not in enabled drivers build config 00:02:14.503 net/qede: not in enabled drivers build config 00:02:14.503 net/ring: not in enabled drivers build config 00:02:14.503 net/sfc: not in enabled drivers build config 00:02:14.503 net/softnic: not in enabled drivers build config 00:02:14.503 net/tap: not in enabled drivers build config 00:02:14.503 net/thunderx: not in enabled drivers build config 00:02:14.503 net/txgbe: not in enabled drivers build config 00:02:14.503 net/vdev_netvsc: not in enabled drivers build config 00:02:14.503 net/vhost: not in enabled drivers build config 00:02:14.503 net/virtio: not in enabled drivers build config 00:02:14.503 net/vmxnet3: not in enabled drivers build config 00:02:14.503 raw/*: missing internal dependency, "rawdev" 00:02:14.503 crypto/armv8: not in enabled drivers build config 00:02:14.503 crypto/bcmfs: not in enabled drivers build config 00:02:14.503 crypto/caam_jr: not in enabled drivers build config 00:02:14.503 crypto/ccp: not in enabled drivers build config 00:02:14.503 crypto/cnxk: not in enabled drivers build config 00:02:14.503 crypto/dpaa_sec: not in enabled drivers build config 00:02:14.503 crypto/dpaa2_sec: not in enabled drivers build config 00:02:14.503 crypto/mvsam: not in enabled drivers build config 00:02:14.503 crypto/nitrox: not in enabled drivers build config 00:02:14.503 crypto/null: not in enabled drivers build config 00:02:14.503 crypto/octeontx: not in enabled drivers build config 00:02:14.503 crypto/openssl: not in enabled drivers build config 00:02:14.503 crypto/scheduler: not in enabled drivers build config 00:02:14.503 crypto/uadk: not in enabled drivers build config 00:02:14.503 crypto/virtio: not in enabled drivers build config 00:02:14.503 compress/nitrox: not in enabled drivers build config 00:02:14.503 compress/octeontx: not in enabled drivers build config 00:02:14.503 compress/zlib: not in enabled drivers build config 00:02:14.503 regex/*: missing internal dependency, "regexdev" 00:02:14.503 ml/*: missing internal dependency, "mldev" 00:02:14.503 vdpa/ifc: not in enabled drivers build config 00:02:14.503 vdpa/mlx5: not in enabled drivers build config 00:02:14.503 vdpa/nfp: not in enabled drivers build config 00:02:14.503 vdpa/sfc: not in enabled drivers build config 00:02:14.503 event/*: missing internal dependency, "eventdev" 00:02:14.503 baseband/*: missing internal dependency, "bbdev" 00:02:14.503 gpu/*: missing internal dependency, "gpudev" 00:02:14.503 00:02:14.503 00:02:15.889 Build targets in project: 114 00:02:15.889 00:02:15.889 DPDK 24.03.0 00:02:15.889 00:02:15.889 User defined options 00:02:15.889 buildtype : debug 00:02:15.889 default_library : shared 00:02:15.889 libdir : lib 00:02:15.889 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:15.889 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:15.889 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:15.889 cpu_instruction_set: native 00:02:15.889 disable_apps : 
test-dma-perf,test,test-sad,test-acl,test-pmd,test-mldev,test-compress-perf,test-cmdline,test-regex,test-fib,graph,test-bbdev,dumpcap,test-gpudev,proc-info,test-pipeline,test-flow-perf,test-crypto-perf,pdump,test-eventdev,test-security-perf 00:02:15.889 disable_libs : port,lpm,ipsec,regexdev,dispatcher,argparse,bitratestats,rawdev,stack,graph,acl,bbdev,pipeline,member,sched,pcapng,mldev,eventdev,efd,metrics,latencystats,cfgfile,ip_frag,jobstats,pdump,pdcp,rib,node,fib,distributor,gso,table,bpf,gpudev,gro 00:02:15.889 enable_docs : false 00:02:15.889 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:15.889 enable_kmods : false 00:02:15.889 max_lcores : 128 00:02:15.889 tests : false 00:02:15.889 00:02:15.889 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:16.151 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:16.428 [1/377] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:16.428 [2/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:16.428 [3/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:16.428 [4/377] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:16.428 [5/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:16.428 [6/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:16.428 [7/377] Linking static target lib/librte_kvargs.a 00:02:16.428 [8/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:16.428 [9/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:16.695 [10/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:16.695 [11/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:16.695 [12/377] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:16.695 [13/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:16.695 [14/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:16.695 [15/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:16.695 [16/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:16.695 [17/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:16.695 [18/377] Linking static target lib/librte_log.a 00:02:16.695 [19/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:16.695 [20/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:16.695 [21/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:16.695 [22/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:16.695 [23/377] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:16.695 [24/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:16.957 [25/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:16.957 [26/377] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:16.957 [27/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:16.957 [28/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:16.957 [29/377] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:16.957 [30/377] Linking static target lib/librte_pci.a 00:02:16.957 [31/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:16.957 [32/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:16.957 [33/377] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:16.957 [34/377] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:16.957 [35/377] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:17.222 [36/377] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:17.222 [37/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:17.222 [38/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:17.222 [39/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:17.222 [40/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:17.222 [41/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:17.222 [42/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:17.222 [43/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:17.483 [44/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:17.483 [45/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:17.483 [46/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:17.483 [47/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:17.483 [48/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:17.484 [49/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:17.484 [50/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:17.484 [51/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:17.484 [52/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:17.484 [53/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:17.484 [54/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:17.484 [55/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:17.484 [56/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:17.484 [57/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:17.484 [58/377] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:17.484 [59/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:17.484 [60/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:17.484 [61/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:17.484 [62/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:17.484 [63/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:17.484 [64/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:17.484 [65/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:17.484 [66/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:17.484 [67/377] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:17.484 [68/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:17.484 [69/377] Compiling C 
object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:17.484 [70/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:17.484 [71/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:17.484 [72/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:17.484 [73/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:17.484 [74/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:17.484 [75/377] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.484 [76/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:17.484 [77/377] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:17.484 [78/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:17.484 [79/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:17.484 [80/377] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o 00:02:17.484 [81/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:17.484 [82/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:17.484 [83/377] Linking static target lib/librte_telemetry.a 00:02:17.484 [84/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:17.484 [85/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:17.484 [86/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:17.484 [87/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:17.484 [88/377] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:17.484 [89/377] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:17.484 [90/377] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:17.484 [91/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:17.484 [92/377] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:17.484 [93/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:17.484 [94/377] Linking static target lib/librte_ring.a 00:02:17.745 [95/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:17.745 [96/377] Linking static target lib/librte_timer.a 00:02:17.745 [97/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:17.745 [98/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:17.745 [99/377] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.745 [100/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:17.745 [101/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:17.745 [102/377] Linking static target lib/librte_meter.a 00:02:17.745 [103/377] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:17.745 [104/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:17.745 [105/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:17.745 [106/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:17.745 [107/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:17.745 [108/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:17.745 [109/377] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:17.745 [110/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:17.745 [111/377] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:17.745 [112/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:17.745 [113/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:17.745 [114/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:17.745 [115/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:17.745 [116/377] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:17.745 [117/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:17.745 [118/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:17.745 [119/377] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:17.745 [120/377] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:17.745 [121/377] Linking static target lib/librte_cmdline.a 00:02:17.745 [122/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:17.745 [123/377] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:17.745 [124/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:17.745 [125/377] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:17.745 [126/377] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:17.745 [127/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:17.745 [128/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:17.745 [129/377] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:17.745 [130/377] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:17.745 [131/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:17.745 [132/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:18.004 [133/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:18.004 [134/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:18.004 [135/377] Linking static target lib/librte_net.a 00:02:18.004 [136/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:18.004 [137/377] Linking static target lib/librte_compressdev.a 00:02:18.004 [138/377] Linking static target lib/librte_rcu.a 00:02:18.004 [139/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:18.004 [140/377] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:18.004 [141/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:18.004 [142/377] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:18.004 [143/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:18.004 [144/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:18.004 [145/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:18.004 [146/377] Linking static target lib/librte_reorder.a 00:02:18.004 [147/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:18.004 [148/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:18.004 [149/377] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:18.004 [150/377] Linking static target 
lib/librte_hash.a 00:02:18.004 [151/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:18.004 [152/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:18.004 [153/377] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:18.004 [154/377] Linking static target lib/librte_security.a 00:02:18.004 [155/377] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:18.004 [156/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:18.004 [157/377] Linking static target lib/librte_power.a 00:02:18.004 [158/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:18.263 [159/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:18.263 [160/377] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:18.263 [161/377] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [162/377] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [163/377] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [164/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:18.263 [165/377] Linking target lib/librte_log.so.24.1 00:02:18.263 [166/377] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [167/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:18.263 [168/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:18.263 [169/377] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:18.263 [170/377] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:18.263 [171/377] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:18.263 [172/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:18.263 [173/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:18.263 [174/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:18.263 [175/377] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:18.263 [176/377] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [177/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:18.263 [178/377] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [179/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:18.263 [180/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:18.263 [181/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:18.263 [182/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:18.263 [183/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:18.263 [184/377] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:18.263 [185/377] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:18.263 [186/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:18.263 [187/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:18.263 [188/377] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:18.263 [189/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:18.263 [190/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:18.263 [191/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:18.263 [192/377] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [193/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:18.263 [194/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:18.263 [195/377] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.263 [196/377] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:18.263 [197/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:18.263 [198/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:18.263 [199/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:18.263 [200/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:18.263 [201/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:18.263 [202/377] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:18.524 [203/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:18.524 [204/377] Linking target lib/librte_kvargs.so.24.1 00:02:18.524 [205/377] Linking target lib/librte_telemetry.so.24.1 00:02:18.524 [206/377] Linking static target lib/librte_mempool.a 00:02:18.524 [207/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:18.524 [208/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:18.524 [209/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:18.524 [210/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:18.524 [211/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:18.524 [212/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:18.524 [213/377] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:18.524 [214/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:18.524 [215/377] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:18.524 [216/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:18.524 [217/377] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:18.524 [218/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:18.524 [219/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:18.524 [220/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:18.524 [221/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:18.524 [222/377] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:18.524 [223/377] Compiling C 
object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:18.524 [224/377] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:18.524 [225/377] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:18.524 [226/377] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.524 [227/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:18.524 [228/377] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:18.524 [229/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:18.524 [230/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:18.524 [231/377] Linking static target drivers/librte_bus_vdev.a 00:02:18.524 [232/377] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:18.524 [233/377] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:18.524 [234/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:18.524 [235/377] Linking static target drivers/librte_bus_auxiliary.a 00:02:18.524 [236/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:18.524 [237/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:18.524 [238/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:18.524 [239/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:18.524 [240/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:18.524 [241/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:18.524 [242/377] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:18.524 [243/377] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.524 [244/377] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.524 [245/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:18.524 [246/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:18.524 [247/377] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.524 [248/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:18.524 [249/377] Linking static target drivers/librte_bus_pci.a 00:02:18.524 [250/377] Linking static target lib/librte_eal.a 00:02:18.524 [251/377] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:18.524 [252/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:18.524 [253/377] Linking static target lib/librte_cryptodev.a 00:02:18.524 [254/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:18.524 [255/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:18.524 [256/377] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:18.524 [257/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:18.524 [258/377] 
Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:18.524 [259/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:18.524 [260/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:18.524 [261/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:18.524 [262/377] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:18.524 [263/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:18.524 [264/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:18.524 [265/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:18.524 [266/377] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:18.524 [267/377] Linking static target lib/librte_dmadev.a 00:02:18.524 [268/377] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:18.524 [269/377] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.524 [270/377] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:18.524 [271/377] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.785 [272/377] Linking static target drivers/librte_mempool_ring.a 00:02:18.785 [273/377] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:18.785 [274/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:18.785 [275/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:18.785 [276/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:18.785 [277/377] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:18.785 [278/377] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:18.785 [279/377] Linking static target drivers/librte_compress_isal.a 00:02:18.785 [280/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:18.785 [281/377] Linking static target lib/librte_ethdev.a 00:02:18.785 [282/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:18.785 [283/377] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:18.785 [284/377] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.785 [285/377] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:18.785 [286/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:18.785 [287/377] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:18.785 [288/377] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.785 [289/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:18.785 [290/377] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.785 [291/377] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:18.785 [292/377] Linking static target drivers/librte_crypto_mlx5.a 00:02:18.785 [293/377] Generating drivers/rte_compress_mlx5.pmd.c 
with a custom command 00:02:18.785 [294/377] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:18.785 [295/377] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:18.785 [296/377] Linking static target drivers/librte_compress_mlx5.a 00:02:18.785 [297/377] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.785 [298/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:19.044 [299/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:19.044 [300/377] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:19.044 [301/377] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.044 [302/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:19.044 [303/377] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:19.044 [304/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:19.044 [305/377] Linking static target lib/librte_mbuf.a 00:02:19.044 [306/377] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:19.044 [307/377] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:19.045 [308/377] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:19.045 [309/377] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:19.304 [310/377] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:19.304 [311/377] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:19.304 [312/377] Linking static target drivers/librte_common_mlx5.a 00:02:19.304 [313/377] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.304 [314/377] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.304 [315/377] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.562 [316/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:19.563 [317/377] Linking static target drivers/libtmp_rte_common_qat.a 00:02:19.822 [318/377] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:19.822 [319/377] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:19.822 [320/377] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:19.822 [321/377] Linking static target drivers/librte_common_qat.a 00:02:20.081 [322/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:20.081 [323/377] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:20.081 [324/377] Linking static target lib/librte_vhost.a 00:02:20.653 [325/377] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.569 [326/377] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.116 [327/377] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.413 [328/377] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 
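(The DPDK "User defined options" summary logged above corresponds, roughly, to a meson setup invocation along the following lines. This is an illustrative sketch only: the actual command is generated by SPDK's build scripts, and the long c_args, c_link_args, disable_apps, disable_libs, and enable_drivers values are abbreviated here with "..." rather than reproduced in full from the log.)

    # Reconstructed from the logged option summary; build directory name matches
    # the "ninja: Entering directory .../spdk/dpdk/build-tmp" line above.
    meson setup build-tmp \
        -Dbuildtype=debug \
        -Ddefault_library=shared \
        -Dlibdir=lib \
        -Dprefix=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build \
        -Dc_args='-Wno-stringop-overflow -fcommon ... -fPIC -Werror' \
        -Dc_link_args='-L.../intel-ipsec-mb/lib -L.../isa-l/.libs -lisal' \
        -Dcpu_instruction_set=native \
        -Ddisable_apps='test-dma-perf,test,test-sad,...' \
        -Ddisable_libs='port,lpm,ipsec,regexdev,...' \
        -Denable_docs=false \
        -Denable_drivers='bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,...' \
        -Denable_kmods=false \
        -Dmax_lcores=128 \
        -Dtests=false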
00:02:30.324 [329/377] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.324 [330/377] Linking target lib/librte_eal.so.24.1 00:02:30.585 [331/377] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:30.585 [332/377] Linking target lib/librte_ring.so.24.1 00:02:30.585 [333/377] Linking target lib/librte_meter.so.24.1 00:02:30.585 [334/377] Linking target lib/librte_pci.so.24.1 00:02:30.585 [335/377] Linking target lib/librte_dmadev.so.24.1 00:02:30.585 [336/377] Linking target drivers/librte_bus_vdev.so.24.1 00:02:30.585 [337/377] Linking target lib/librte_timer.so.24.1 00:02:30.585 [338/377] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:30.846 [339/377] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:30.846 [340/377] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:30.846 [341/377] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:30.846 [342/377] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:30.846 [343/377] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:30.846 [344/377] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:30.846 [345/377] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:30.846 [346/377] Linking target lib/librte_rcu.so.24.1 00:02:30.846 [347/377] Linking target drivers/librte_bus_pci.so.24.1 00:02:30.846 [348/377] Linking target lib/librte_mempool.so.24.1 00:02:30.846 [349/377] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:30.846 [350/377] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:30.846 [351/377] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:31.106 [352/377] Linking target drivers/librte_mempool_ring.so.24.1 00:02:31.106 [353/377] Linking target lib/librte_mbuf.so.24.1 00:02:31.106 [354/377] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:31.106 [355/377] Linking target lib/librte_reorder.so.24.1 00:02:31.106 [356/377] Linking target lib/librte_compressdev.so.24.1 00:02:31.106 [357/377] Linking target lib/librte_net.so.24.1 00:02:31.106 [358/377] Linking target lib/librte_cryptodev.so.24.1 00:02:31.366 [359/377] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:31.366 [360/377] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:31.366 [361/377] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:31.366 [362/377] Linking target lib/librte_hash.so.24.1 00:02:31.366 [363/377] Linking target lib/librte_security.so.24.1 00:02:31.366 [364/377] Linking target lib/librte_cmdline.so.24.1 00:02:31.366 [365/377] Linking target drivers/librte_compress_isal.so.24.1 00:02:31.366 [366/377] Linking target lib/librte_ethdev.so.24.1 00:02:31.631 [367/377] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:31.632 [368/377] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:31.632 [369/377] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:31.632 [370/377] Linking target drivers/librte_common_mlx5.so.24.1 00:02:31.632 [371/377] Linking target 
lib/librte_power.so.24.1 00:02:31.632 [372/377] Linking target lib/librte_vhost.so.24.1 00:02:31.892 [373/377] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:31.892 [374/377] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:31.892 [375/377] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:31.892 [376/377] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:31.892 [377/377] Linking target drivers/librte_common_qat.so.24.1 00:02:31.892 INFO: autodetecting backend as ninja 00:02:31.892 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 128 00:02:33.277 CC lib/log/log.o 00:02:33.277 CC lib/log/log_flags.o 00:02:33.277 CC lib/log/log_deprecated.o 00:02:33.277 CC lib/ut/ut.o 00:02:33.277 CC lib/ut_mock/mock.o 00:02:33.277 LIB libspdk_log.a 00:02:33.277 LIB libspdk_ut.a 00:02:33.277 SO libspdk_log.so.7.0 00:02:33.538 SO libspdk_ut.so.2.0 00:02:33.538 SYMLINK libspdk_ut.so 00:02:33.538 SYMLINK libspdk_log.so 00:02:33.538 LIB libspdk_ut_mock.a 00:02:33.538 SO libspdk_ut_mock.so.6.0 00:02:33.798 SYMLINK libspdk_ut_mock.so 00:02:33.798 CC lib/util/base64.o 00:02:33.798 CC lib/util/bit_array.o 00:02:33.798 CXX lib/trace_parser/trace.o 00:02:33.798 CC lib/util/cpuset.o 00:02:33.798 CC lib/util/crc16.o 00:02:33.798 CC lib/util/crc32_ieee.o 00:02:33.798 CC lib/util/crc32.o 00:02:33.798 CC lib/util/crc32c.o 00:02:33.798 CC lib/util/crc64.o 00:02:33.798 CC lib/ioat/ioat.o 00:02:33.798 CC lib/util/dif.o 00:02:33.798 CC lib/util/file.o 00:02:33.798 CC lib/util/fd.o 00:02:33.798 CC lib/util/hexlify.o 00:02:33.798 CC lib/util/iov.o 00:02:33.798 CC lib/dma/dma.o 00:02:33.798 CC lib/util/math.o 00:02:33.798 CC lib/util/pipe.o 00:02:33.798 CC lib/util/strerror_tls.o 00:02:33.798 CC lib/util/string.o 00:02:33.798 CC lib/util/uuid.o 00:02:33.798 CC lib/util/fd_group.o 00:02:33.798 CC lib/util/xor.o 00:02:33.798 CC lib/util/zipf.o 00:02:34.059 CC lib/vfio_user/host/vfio_user.o 00:02:34.059 CC lib/vfio_user/host/vfio_user_pci.o 00:02:34.059 LIB libspdk_dma.a 00:02:34.059 SO libspdk_dma.so.4.0 00:02:34.059 LIB libspdk_ioat.a 00:02:34.059 SO libspdk_ioat.so.7.0 00:02:34.059 SYMLINK libspdk_dma.so 00:02:34.319 SYMLINK libspdk_ioat.so 00:02:34.319 LIB libspdk_vfio_user.a 00:02:34.319 LIB libspdk_util.a 00:02:34.319 SO libspdk_vfio_user.so.5.0 00:02:34.319 SO libspdk_util.so.9.1 00:02:34.319 SYMLINK libspdk_vfio_user.so 00:02:34.580 LIB libspdk_trace_parser.a 00:02:34.580 SO libspdk_trace_parser.so.5.0 00:02:34.839 SYMLINK libspdk_util.so 00:02:34.839 SYMLINK libspdk_trace_parser.so 00:02:35.100 CC lib/rdma_utils/rdma_utils.o 00:02:35.100 CC lib/conf/conf.o 00:02:35.100 CC lib/idxd/idxd.o 00:02:35.100 CC lib/idxd/idxd_user.o 00:02:35.100 CC lib/json/json_parse.o 00:02:35.100 CC lib/vmd/vmd.o 00:02:35.101 CC lib/idxd/idxd_kernel.o 00:02:35.101 CC lib/json/json_util.o 00:02:35.101 CC lib/vmd/led.o 00:02:35.101 CC lib/json/json_write.o 00:02:35.101 CC lib/reduce/reduce.o 00:02:35.101 CC lib/env_dpdk/env.o 00:02:35.101 CC lib/env_dpdk/memory.o 00:02:35.101 CC lib/rdma_provider/common.o 00:02:35.101 CC lib/env_dpdk/pci.o 00:02:35.101 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:35.101 CC lib/env_dpdk/init.o 00:02:35.101 CC lib/env_dpdk/threads.o 00:02:35.101 CC lib/env_dpdk/pci_ioat.o 00:02:35.101 CC lib/env_dpdk/pci_virtio.o 00:02:35.101 CC lib/env_dpdk/pci_vmd.o 00:02:35.101 CC lib/env_dpdk/pci_idxd.o 00:02:35.101 CC lib/env_dpdk/pci_event.o 00:02:35.101 CC 
lib/env_dpdk/sigbus_handler.o 00:02:35.101 CC lib/env_dpdk/pci_dpdk.o 00:02:35.101 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:35.101 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:35.362 LIB libspdk_rdma_provider.a 00:02:35.362 LIB libspdk_conf.a 00:02:35.362 SO libspdk_rdma_provider.so.6.0 00:02:35.362 LIB libspdk_rdma_utils.a 00:02:35.362 SO libspdk_conf.so.6.0 00:02:35.362 SO libspdk_rdma_utils.so.1.0 00:02:35.362 SYMLINK libspdk_rdma_provider.so 00:02:35.362 LIB libspdk_json.a 00:02:35.362 SYMLINK libspdk_conf.so 00:02:35.362 SO libspdk_json.so.6.0 00:02:35.362 SYMLINK libspdk_rdma_utils.so 00:02:35.623 SYMLINK libspdk_json.so 00:02:35.623 LIB libspdk_idxd.a 00:02:35.623 SO libspdk_idxd.so.12.0 00:02:35.623 LIB libspdk_vmd.a 00:02:35.623 LIB libspdk_reduce.a 00:02:35.623 SYMLINK libspdk_idxd.so 00:02:35.623 SO libspdk_vmd.so.6.0 00:02:35.623 SO libspdk_reduce.so.6.0 00:02:35.884 SYMLINK libspdk_reduce.so 00:02:35.884 SYMLINK libspdk_vmd.so 00:02:35.884 CC lib/jsonrpc/jsonrpc_server.o 00:02:35.884 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:35.884 CC lib/jsonrpc/jsonrpc_client.o 00:02:35.884 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:36.144 LIB libspdk_env_dpdk.a 00:02:36.404 SO libspdk_env_dpdk.so.14.1 00:02:36.404 LIB libspdk_jsonrpc.a 00:02:36.404 SO libspdk_jsonrpc.so.6.0 00:02:36.404 SYMLINK libspdk_env_dpdk.so 00:02:36.664 SYMLINK libspdk_jsonrpc.so 00:02:36.924 CC lib/rpc/rpc.o 00:02:37.494 LIB libspdk_rpc.a 00:02:37.494 SO libspdk_rpc.so.6.0 00:02:37.494 SYMLINK libspdk_rpc.so 00:02:37.755 CC lib/trace/trace.o 00:02:37.755 CC lib/notify/notify.o 00:02:37.755 CC lib/trace/trace_flags.o 00:02:37.755 CC lib/notify/notify_rpc.o 00:02:37.755 CC lib/trace/trace_rpc.o 00:02:37.755 CC lib/keyring/keyring.o 00:02:37.755 CC lib/keyring/keyring_rpc.o 00:02:38.015 LIB libspdk_keyring.a 00:02:38.015 LIB libspdk_trace.a 00:02:38.015 SO libspdk_keyring.so.1.0 00:02:38.015 SO libspdk_trace.so.10.0 00:02:38.015 LIB libspdk_notify.a 00:02:38.275 SYMLINK libspdk_keyring.so 00:02:38.275 SO libspdk_notify.so.6.0 00:02:38.275 SYMLINK libspdk_trace.so 00:02:38.275 SYMLINK libspdk_notify.so 00:02:38.536 CC lib/thread/thread.o 00:02:38.536 CC lib/thread/iobuf.o 00:02:38.536 CC lib/sock/sock.o 00:02:38.536 CC lib/sock/sock_rpc.o 00:02:38.796 LIB libspdk_sock.a 00:02:38.796 SO libspdk_sock.so.10.0 00:02:39.057 SYMLINK libspdk_sock.so 00:02:39.317 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:39.317 CC lib/nvme/nvme_ctrlr.o 00:02:39.317 CC lib/nvme/nvme_fabric.o 00:02:39.317 CC lib/nvme/nvme_ns_cmd.o 00:02:39.317 CC lib/nvme/nvme_ns.o 00:02:39.317 CC lib/nvme/nvme_pcie_common.o 00:02:39.318 CC lib/nvme/nvme_pcie.o 00:02:39.318 CC lib/nvme/nvme_qpair.o 00:02:39.318 CC lib/nvme/nvme.o 00:02:39.318 CC lib/nvme/nvme_transport.o 00:02:39.318 CC lib/nvme/nvme_quirks.o 00:02:39.318 CC lib/nvme/nvme_discovery.o 00:02:39.318 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:39.318 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:39.318 CC lib/nvme/nvme_tcp.o 00:02:39.318 CC lib/nvme/nvme_opal.o 00:02:39.318 CC lib/nvme/nvme_io_msg.o 00:02:39.318 CC lib/nvme/nvme_poll_group.o 00:02:39.318 CC lib/nvme/nvme_zns.o 00:02:39.318 CC lib/nvme/nvme_stubs.o 00:02:39.318 CC lib/nvme/nvme_auth.o 00:02:39.318 CC lib/nvme/nvme_cuse.o 00:02:39.318 CC lib/nvme/nvme_rdma.o 00:02:41.862 LIB libspdk_thread.a 00:02:41.862 SO libspdk_thread.so.10.1 00:02:42.124 SYMLINK libspdk_thread.so 00:02:42.384 CC lib/accel/accel.o 00:02:42.384 CC lib/accel/accel_sw.o 00:02:42.384 CC lib/init/json_config.o 00:02:42.384 CC lib/accel/accel_rpc.o 00:02:42.384 CC lib/blob/blobstore.o 00:02:42.384 
CC lib/init/subsystem.o 00:02:42.384 CC lib/blob/request.o 00:02:42.384 CC lib/init/subsystem_rpc.o 00:02:42.384 CC lib/blob/zeroes.o 00:02:42.384 CC lib/virtio/virtio.o 00:02:42.384 CC lib/blob/blob_bs_dev.o 00:02:42.384 CC lib/init/rpc.o 00:02:42.384 CC lib/virtio/virtio_vhost_user.o 00:02:42.384 CC lib/virtio/virtio_vfio_user.o 00:02:42.384 CC lib/virtio/virtio_pci.o 00:02:42.646 LIB libspdk_init.a 00:02:42.646 SO libspdk_init.so.5.0 00:02:42.646 LIB libspdk_virtio.a 00:02:42.906 SO libspdk_virtio.so.7.0 00:02:42.906 SYMLINK libspdk_init.so 00:02:42.906 SYMLINK libspdk_virtio.so 00:02:43.166 CC lib/event/app.o 00:02:43.166 CC lib/event/reactor.o 00:02:43.166 CC lib/event/log_rpc.o 00:02:43.166 CC lib/event/app_rpc.o 00:02:43.166 CC lib/event/scheduler_static.o 00:02:43.427 LIB libspdk_event.a 00:02:43.687 SO libspdk_event.so.14.0 00:02:43.687 SYMLINK libspdk_event.so 00:02:43.947 LIB libspdk_nvme.a 00:02:43.947 SO libspdk_nvme.so.13.1 00:02:44.518 SYMLINK libspdk_nvme.so 00:02:44.518 LIB libspdk_accel.a 00:02:44.518 SO libspdk_accel.so.15.1 00:02:44.518 SYMLINK libspdk_accel.so 00:02:44.779 LIB libspdk_blob.a 00:02:44.779 SO libspdk_blob.so.11.0 00:02:45.040 SYMLINK libspdk_blob.so 00:02:45.040 CC lib/bdev/bdev.o 00:02:45.040 CC lib/bdev/bdev_rpc.o 00:02:45.040 CC lib/bdev/bdev_zone.o 00:02:45.040 CC lib/bdev/part.o 00:02:45.040 CC lib/bdev/scsi_nvme.o 00:02:45.301 CC lib/blobfs/blobfs.o 00:02:45.301 CC lib/blobfs/tree.o 00:02:45.301 CC lib/lvol/lvol.o 00:02:45.874 LIB libspdk_blobfs.a 00:02:46.134 SO libspdk_blobfs.so.10.0 00:02:46.134 LIB libspdk_lvol.a 00:02:46.134 SYMLINK libspdk_blobfs.so 00:02:46.134 SO libspdk_lvol.so.10.0 00:02:46.134 SYMLINK libspdk_lvol.so 00:02:47.074 LIB libspdk_bdev.a 00:02:47.074 SO libspdk_bdev.so.15.1 00:02:47.074 SYMLINK libspdk_bdev.so 00:02:47.731 CC lib/scsi/dev.o 00:02:47.731 CC lib/scsi/lun.o 00:02:47.731 CC lib/nvmf/ctrlr.o 00:02:47.731 CC lib/nvmf/ctrlr_discovery.o 00:02:47.731 CC lib/scsi/port.o 00:02:47.731 CC lib/scsi/scsi.o 00:02:47.731 CC lib/nvmf/ctrlr_bdev.o 00:02:47.731 CC lib/scsi/scsi_bdev.o 00:02:47.731 CC lib/nvmf/subsystem.o 00:02:47.731 CC lib/scsi/scsi_pr.o 00:02:47.732 CC lib/nbd/nbd.o 00:02:47.732 CC lib/scsi/scsi_rpc.o 00:02:47.732 CC lib/nvmf/nvmf.o 00:02:47.732 CC lib/nbd/nbd_rpc.o 00:02:47.732 CC lib/scsi/task.o 00:02:47.732 CC lib/nvmf/nvmf_rpc.o 00:02:47.732 CC lib/ublk/ublk.o 00:02:47.732 CC lib/nvmf/transport.o 00:02:47.732 CC lib/nvmf/tcp.o 00:02:47.732 CC lib/ublk/ublk_rpc.o 00:02:47.732 CC lib/nvmf/stubs.o 00:02:47.732 CC lib/nvmf/mdns_server.o 00:02:47.732 CC lib/ftl/ftl_core.o 00:02:47.732 CC lib/nvmf/rdma.o 00:02:47.732 CC lib/ftl/ftl_init.o 00:02:47.732 CC lib/nvmf/auth.o 00:02:47.732 CC lib/ftl/ftl_layout.o 00:02:47.732 CC lib/ftl/ftl_debug.o 00:02:47.732 CC lib/ftl/ftl_io.o 00:02:47.732 CC lib/ftl/ftl_sb.o 00:02:47.732 CC lib/ftl/ftl_l2p.o 00:02:47.732 CC lib/ftl/ftl_l2p_flat.o 00:02:47.732 CC lib/ftl/ftl_nv_cache.o 00:02:47.732 CC lib/ftl/ftl_band.o 00:02:47.732 CC lib/ftl/ftl_band_ops.o 00:02:47.732 CC lib/ftl/ftl_writer.o 00:02:47.732 CC lib/ftl/ftl_rq.o 00:02:47.732 CC lib/ftl/ftl_reloc.o 00:02:47.732 CC lib/ftl/ftl_l2p_cache.o 00:02:47.732 CC lib/ftl/ftl_p2l.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:47.732 CC 
lib/ftl/mngt/ftl_mngt_band.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:47.732 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:47.732 CC lib/ftl/utils/ftl_conf.o 00:02:47.732 CC lib/ftl/utils/ftl_md.o 00:02:47.732 CC lib/ftl/utils/ftl_mempool.o 00:02:47.732 CC lib/ftl/utils/ftl_bitmap.o 00:02:47.732 CC lib/ftl/utils/ftl_property.o 00:02:47.732 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:47.732 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:47.732 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:47.732 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:47.732 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:47.732 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:47.732 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:47.732 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:47.732 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:47.732 CC lib/ftl/base/ftl_base_bdev.o 00:02:47.732 CC lib/ftl/base/ftl_base_dev.o 00:02:47.732 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:47.732 CC lib/ftl/ftl_trace.o 00:02:47.732 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:48.313 LIB libspdk_nbd.a 00:02:48.313 SO libspdk_nbd.so.7.0 00:02:48.313 LIB libspdk_scsi.a 00:02:48.313 SYMLINK libspdk_nbd.so 00:02:48.313 SO libspdk_scsi.so.9.0 00:02:48.313 SYMLINK libspdk_scsi.so 00:02:48.572 LIB libspdk_ublk.a 00:02:48.572 SO libspdk_ublk.so.3.0 00:02:48.572 SYMLINK libspdk_ublk.so 00:02:48.572 LIB libspdk_ftl.a 00:02:48.832 CC lib/vhost/vhost_rpc.o 00:02:48.832 CC lib/vhost/vhost.o 00:02:48.832 CC lib/vhost/vhost_scsi.o 00:02:48.832 CC lib/vhost/vhost_blk.o 00:02:48.832 CC lib/vhost/rte_vhost_user.o 00:02:48.832 CC lib/iscsi/conn.o 00:02:48.832 CC lib/iscsi/init_grp.o 00:02:48.832 CC lib/iscsi/iscsi.o 00:02:48.832 CC lib/iscsi/md5.o 00:02:48.832 CC lib/iscsi/param.o 00:02:48.832 CC lib/iscsi/portal_grp.o 00:02:48.832 CC lib/iscsi/tgt_node.o 00:02:48.832 CC lib/iscsi/iscsi_subsystem.o 00:02:48.833 CC lib/iscsi/iscsi_rpc.o 00:02:48.833 CC lib/iscsi/task.o 00:02:48.833 SO libspdk_ftl.so.9.0 00:02:49.093 SYMLINK libspdk_ftl.so 00:02:49.666 LIB libspdk_nvmf.a 00:02:49.666 SO libspdk_nvmf.so.18.1 00:02:49.666 LIB libspdk_vhost.a 00:02:49.666 SO libspdk_vhost.so.8.0 00:02:49.666 SYMLINK libspdk_nvmf.so 00:02:49.928 LIB libspdk_iscsi.a 00:02:49.928 SO libspdk_iscsi.so.8.0 00:02:49.928 SYMLINK libspdk_vhost.so 00:02:50.189 SYMLINK libspdk_iscsi.so 00:02:50.762 CC module/env_dpdk/env_dpdk_rpc.o 00:02:50.762 CC module/accel/error/accel_error.o 00:02:50.762 CC module/accel/error/accel_error_rpc.o 00:02:50.762 LIB libspdk_env_dpdk_rpc.a 00:02:50.762 CC module/accel/ioat/accel_ioat.o 00:02:50.762 CC module/accel/ioat/accel_ioat_rpc.o 00:02:50.762 CC module/blob/bdev/blob_bdev.o 00:02:50.762 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:50.762 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:50.762 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:50.762 CC module/accel/iaa/accel_iaa.o 00:02:50.762 CC module/accel/iaa/accel_iaa_rpc.o 00:02:50.762 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:50.762 CC module/keyring/linux/keyring.o 00:02:50.762 CC module/accel/dsa/accel_dsa.o 00:02:50.762 CC module/keyring/file/keyring.o 00:02:50.762 CC module/keyring/linux/keyring_rpc.o 00:02:50.762 CC module/accel/dsa/accel_dsa_rpc.o 00:02:50.762 CC module/keyring/file/keyring_rpc.o 00:02:50.762 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:50.762 CC module/scheduler/gscheduler/gscheduler.o 00:02:50.762 CC 
module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:50.762 CC module/sock/posix/posix.o 00:02:51.024 SO libspdk_env_dpdk_rpc.so.6.0 00:02:51.024 SYMLINK libspdk_env_dpdk_rpc.so 00:02:51.024 LIB libspdk_scheduler_gscheduler.a 00:02:51.024 LIB libspdk_keyring_linux.a 00:02:51.024 LIB libspdk_scheduler_dpdk_governor.a 00:02:51.024 LIB libspdk_keyring_file.a 00:02:51.024 LIB libspdk_accel_error.a 00:02:51.024 LIB libspdk_accel_ioat.a 00:02:51.024 SO libspdk_keyring_file.so.1.0 00:02:51.024 SO libspdk_scheduler_gscheduler.so.4.0 00:02:51.024 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:51.024 LIB libspdk_scheduler_dynamic.a 00:02:51.024 SO libspdk_keyring_linux.so.1.0 00:02:51.024 LIB libspdk_accel_iaa.a 00:02:51.024 SO libspdk_accel_error.so.2.0 00:02:51.024 SO libspdk_accel_ioat.so.6.0 00:02:51.024 SO libspdk_scheduler_dynamic.so.4.0 00:02:51.285 LIB libspdk_blob_bdev.a 00:02:51.285 SYMLINK libspdk_scheduler_gscheduler.so 00:02:51.285 SYMLINK libspdk_keyring_file.so 00:02:51.285 LIB libspdk_accel_dsa.a 00:02:51.285 SO libspdk_accel_iaa.so.3.0 00:02:51.285 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:51.285 SYMLINK libspdk_accel_ioat.so 00:02:51.285 SYMLINK libspdk_keyring_linux.so 00:02:51.285 SYMLINK libspdk_accel_error.so 00:02:51.285 SO libspdk_blob_bdev.so.11.0 00:02:51.285 SO libspdk_accel_dsa.so.5.0 00:02:51.285 SYMLINK libspdk_scheduler_dynamic.so 00:02:51.285 SYMLINK libspdk_accel_iaa.so 00:02:51.285 SYMLINK libspdk_blob_bdev.so 00:02:51.285 SYMLINK libspdk_accel_dsa.so 00:02:51.546 LIB libspdk_sock_posix.a 00:02:51.546 SO libspdk_sock_posix.so.6.0 00:02:51.807 SYMLINK libspdk_sock_posix.so 00:02:51.807 LIB libspdk_accel_dpdk_compressdev.a 00:02:51.807 CC module/bdev/error/vbdev_error.o 00:02:51.807 CC module/bdev/error/vbdev_error_rpc.o 00:02:51.807 CC module/bdev/gpt/gpt.o 00:02:51.807 CC module/bdev/gpt/vbdev_gpt.o 00:02:51.807 CC module/bdev/aio/bdev_aio_rpc.o 00:02:51.807 CC module/bdev/aio/bdev_aio.o 00:02:51.807 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:51.807 CC module/bdev/compress/vbdev_compress.o 00:02:51.807 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:51.807 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:51.807 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:51.807 CC module/bdev/null/bdev_null.o 00:02:51.807 CC module/bdev/lvol/vbdev_lvol.o 00:02:51.807 CC module/bdev/null/bdev_null_rpc.o 00:02:51.807 CC module/bdev/delay/vbdev_delay.o 00:02:51.807 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:51.807 CC module/bdev/nvme/bdev_nvme.o 00:02:51.807 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:51.807 CC module/bdev/nvme/nvme_rpc.o 00:02:51.807 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:51.807 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:51.807 CC module/bdev/passthru/vbdev_passthru.o 00:02:51.807 CC module/bdev/nvme/bdev_mdns_client.o 00:02:51.807 CC module/bdev/crypto/vbdev_crypto.o 00:02:51.807 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:51.807 CC module/bdev/nvme/vbdev_opal.o 00:02:51.807 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:51.807 CC module/bdev/split/vbdev_split.o 00:02:51.807 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:51.807 CC module/bdev/split/vbdev_split_rpc.o 00:02:51.807 CC module/bdev/malloc/bdev_malloc.o 00:02:51.807 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:51.807 CC module/bdev/raid/bdev_raid.o 00:02:51.807 CC module/bdev/raid/bdev_raid_rpc.o 00:02:51.807 CC module/bdev/raid/raid0.o 00:02:51.807 CC module/bdev/raid/bdev_raid_sb.o 00:02:51.807 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:51.807 CC 
module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:51.807 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:51.807 CC module/bdev/raid/raid1.o 00:02:51.807 CC module/bdev/raid/concat.o 00:02:51.807 CC module/bdev/ftl/bdev_ftl.o 00:02:51.807 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:51.807 CC module/blobfs/bdev/blobfs_bdev.o 00:02:51.807 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:51.807 CC module/bdev/iscsi/bdev_iscsi.o 00:02:51.807 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:52.067 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:52.067 LIB libspdk_accel_dpdk_cryptodev.a 00:02:52.067 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:52.067 LIB libspdk_blobfs_bdev.a 00:02:52.067 LIB libspdk_bdev_gpt.a 00:02:52.067 SO libspdk_blobfs_bdev.so.6.0 00:02:52.067 LIB libspdk_bdev_null.a 00:02:52.067 LIB libspdk_bdev_split.a 00:02:52.067 SO libspdk_bdev_gpt.so.6.0 00:02:52.067 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:52.328 SO libspdk_bdev_null.so.6.0 00:02:52.328 LIB libspdk_bdev_ftl.a 00:02:52.328 SO libspdk_bdev_split.so.6.0 00:02:52.328 LIB libspdk_bdev_passthru.a 00:02:52.328 SYMLINK libspdk_blobfs_bdev.so 00:02:52.328 LIB libspdk_bdev_aio.a 00:02:52.328 LIB libspdk_bdev_error.a 00:02:52.328 SO libspdk_bdev_passthru.so.6.0 00:02:52.328 SO libspdk_bdev_ftl.so.6.0 00:02:52.328 SYMLINK libspdk_bdev_gpt.so 00:02:52.328 LIB libspdk_bdev_crypto.a 00:02:52.328 SYMLINK libspdk_bdev_split.so 00:02:52.328 SO libspdk_bdev_aio.so.6.0 00:02:52.328 SYMLINK libspdk_bdev_null.so 00:02:52.328 LIB libspdk_bdev_delay.a 00:02:52.328 SO libspdk_bdev_error.so.6.0 00:02:52.328 LIB libspdk_bdev_malloc.a 00:02:52.328 LIB libspdk_bdev_iscsi.a 00:02:52.328 SO libspdk_bdev_crypto.so.6.0 00:02:52.328 LIB libspdk_bdev_compress.a 00:02:52.328 SYMLINK libspdk_bdev_ftl.so 00:02:52.328 SO libspdk_bdev_delay.so.6.0 00:02:52.328 SO libspdk_bdev_malloc.so.6.0 00:02:52.328 SYMLINK libspdk_bdev_passthru.so 00:02:52.328 SO libspdk_bdev_iscsi.so.6.0 00:02:52.328 SYMLINK libspdk_bdev_aio.so 00:02:52.328 SO libspdk_bdev_compress.so.6.0 00:02:52.328 SYMLINK libspdk_bdev_error.so 00:02:52.328 LIB libspdk_bdev_virtio.a 00:02:52.328 SYMLINK libspdk_bdev_crypto.so 00:02:52.328 SYMLINK libspdk_bdev_iscsi.so 00:02:52.328 SYMLINK libspdk_bdev_delay.so 00:02:52.328 SO libspdk_bdev_virtio.so.6.0 00:02:52.328 SYMLINK libspdk_bdev_malloc.so 00:02:52.328 SYMLINK libspdk_bdev_compress.so 00:02:52.589 SYMLINK libspdk_bdev_virtio.so 00:02:52.589 LIB libspdk_bdev_zone_block.a 00:02:52.850 LIB libspdk_bdev_raid.a 00:02:52.850 SO libspdk_bdev_zone_block.so.6.0 00:02:52.850 SO libspdk_bdev_raid.so.6.0 00:02:52.850 SYMLINK libspdk_bdev_zone_block.so 00:02:52.850 SYMLINK libspdk_bdev_raid.so 00:02:53.111 LIB libspdk_bdev_lvol.a 00:02:53.111 SO libspdk_bdev_lvol.so.6.0 00:02:53.111 SYMLINK libspdk_bdev_lvol.so 00:02:53.681 LIB libspdk_bdev_nvme.a 00:02:53.681 SO libspdk_bdev_nvme.so.7.0 00:02:53.941 SYMLINK libspdk_bdev_nvme.so 00:02:54.512 CC module/event/subsystems/iobuf/iobuf.o 00:02:54.512 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:54.512 CC module/event/subsystems/vmd/vmd.o 00:02:54.512 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:54.512 CC module/event/subsystems/keyring/keyring.o 00:02:54.512 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:54.512 CC module/event/subsystems/scheduler/scheduler.o 00:02:54.512 CC module/event/subsystems/sock/sock.o 00:02:54.772 LIB libspdk_event_keyring.a 00:02:54.772 LIB libspdk_event_vhost_blk.a 00:02:54.772 LIB libspdk_event_vmd.a 00:02:54.772 LIB libspdk_event_scheduler.a 00:02:54.772 LIB 
libspdk_event_sock.a 00:02:54.772 SO libspdk_event_keyring.so.1.0 00:02:54.772 SO libspdk_event_vmd.so.6.0 00:02:54.772 SO libspdk_event_vhost_blk.so.3.0 00:02:54.772 SO libspdk_event_scheduler.so.4.0 00:02:54.772 SO libspdk_event_sock.so.5.0 00:02:54.772 SYMLINK libspdk_event_vhost_blk.so 00:02:54.772 SYMLINK libspdk_event_keyring.so 00:02:54.772 SYMLINK libspdk_event_scheduler.so 00:02:54.772 SYMLINK libspdk_event_sock.so 00:02:55.033 SYMLINK libspdk_event_vmd.so 00:02:55.033 LIB libspdk_event_iobuf.a 00:02:55.033 SO libspdk_event_iobuf.so.3.0 00:02:55.033 SYMLINK libspdk_event_iobuf.so 00:02:55.294 CC module/event/subsystems/accel/accel.o 00:02:55.556 LIB libspdk_event_accel.a 00:02:55.556 SO libspdk_event_accel.so.6.0 00:02:55.556 SYMLINK libspdk_event_accel.so 00:02:56.128 CC module/event/subsystems/bdev/bdev.o 00:02:56.128 LIB libspdk_event_bdev.a 00:02:56.128 SO libspdk_event_bdev.so.6.0 00:02:56.389 SYMLINK libspdk_event_bdev.so 00:02:56.650 CC module/event/subsystems/scsi/scsi.o 00:02:56.650 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:56.650 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:56.650 CC module/event/subsystems/ublk/ublk.o 00:02:56.650 CC module/event/subsystems/nbd/nbd.o 00:02:56.911 LIB libspdk_event_ublk.a 00:02:56.911 LIB libspdk_event_nbd.a 00:02:56.911 LIB libspdk_event_scsi.a 00:02:56.911 SO libspdk_event_nbd.so.6.0 00:02:56.911 SO libspdk_event_scsi.so.6.0 00:02:56.911 SO libspdk_event_ublk.so.3.0 00:02:56.911 LIB libspdk_event_nvmf.a 00:02:56.911 SYMLINK libspdk_event_nbd.so 00:02:56.911 SYMLINK libspdk_event_scsi.so 00:02:56.911 SYMLINK libspdk_event_ublk.so 00:02:56.911 SO libspdk_event_nvmf.so.6.0 00:02:57.172 SYMLINK libspdk_event_nvmf.so 00:02:57.432 CC module/event/subsystems/iscsi/iscsi.o 00:02:57.432 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:57.432 LIB libspdk_event_vhost_scsi.a 00:02:57.432 LIB libspdk_event_iscsi.a 00:02:57.432 SO libspdk_event_iscsi.so.6.0 00:02:57.432 SO libspdk_event_vhost_scsi.so.3.0 00:02:57.692 SYMLINK libspdk_event_iscsi.so 00:02:57.692 SYMLINK libspdk_event_vhost_scsi.so 00:02:57.692 SO libspdk.so.6.0 00:02:57.692 SYMLINK libspdk.so 00:02:58.263 CC app/spdk_nvme_identify/identify.o 00:02:58.263 TEST_HEADER include/spdk/accel.h 00:02:58.263 CXX app/trace/trace.o 00:02:58.263 TEST_HEADER include/spdk/accel_module.h 00:02:58.263 CC app/spdk_lspci/spdk_lspci.o 00:02:58.263 TEST_HEADER include/spdk/assert.h 00:02:58.263 TEST_HEADER include/spdk/barrier.h 00:02:58.263 TEST_HEADER include/spdk/base64.h 00:02:58.263 CC app/trace_record/trace_record.o 00:02:58.263 TEST_HEADER include/spdk/bdev.h 00:02:58.263 CC test/rpc_client/rpc_client_test.o 00:02:58.263 TEST_HEADER include/spdk/bdev_module.h 00:02:58.263 TEST_HEADER include/spdk/bdev_zone.h 00:02:58.263 TEST_HEADER include/spdk/bit_array.h 00:02:58.263 TEST_HEADER include/spdk/bit_pool.h 00:02:58.263 TEST_HEADER include/spdk/blob_bdev.h 00:02:58.263 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:58.263 CC app/spdk_nvme_discover/discovery_aer.o 00:02:58.263 TEST_HEADER include/spdk/blobfs.h 00:02:58.263 CC app/spdk_nvme_perf/perf.o 00:02:58.263 TEST_HEADER include/spdk/blob.h 00:02:58.263 CC app/spdk_top/spdk_top.o 00:02:58.263 TEST_HEADER include/spdk/conf.h 00:02:58.263 TEST_HEADER include/spdk/config.h 00:02:58.263 TEST_HEADER include/spdk/cpuset.h 00:02:58.263 TEST_HEADER include/spdk/crc16.h 00:02:58.263 TEST_HEADER include/spdk/crc32.h 00:02:58.263 TEST_HEADER include/spdk/crc64.h 00:02:58.263 TEST_HEADER include/spdk/dif.h 00:02:58.263 TEST_HEADER 
include/spdk/dma.h 00:02:58.263 TEST_HEADER include/spdk/endian.h 00:02:58.263 TEST_HEADER include/spdk/env_dpdk.h 00:02:58.263 TEST_HEADER include/spdk/env.h 00:02:58.263 TEST_HEADER include/spdk/event.h 00:02:58.263 TEST_HEADER include/spdk/fd.h 00:02:58.263 TEST_HEADER include/spdk/fd_group.h 00:02:58.263 TEST_HEADER include/spdk/ftl.h 00:02:58.263 TEST_HEADER include/spdk/file.h 00:02:58.263 TEST_HEADER include/spdk/gpt_spec.h 00:02:58.263 TEST_HEADER include/spdk/hexlify.h 00:02:58.263 TEST_HEADER include/spdk/histogram_data.h 00:02:58.263 TEST_HEADER include/spdk/idxd.h 00:02:58.263 TEST_HEADER include/spdk/idxd_spec.h 00:02:58.263 TEST_HEADER include/spdk/init.h 00:02:58.263 CC app/nvmf_tgt/nvmf_main.o 00:02:58.263 TEST_HEADER include/spdk/ioat.h 00:02:58.263 TEST_HEADER include/spdk/ioat_spec.h 00:02:58.263 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:58.264 TEST_HEADER include/spdk/iscsi_spec.h 00:02:58.264 TEST_HEADER include/spdk/json.h 00:02:58.264 TEST_HEADER include/spdk/keyring.h 00:02:58.264 TEST_HEADER include/spdk/jsonrpc.h 00:02:58.264 CC app/iscsi_tgt/iscsi_tgt.o 00:02:58.264 TEST_HEADER include/spdk/keyring_module.h 00:02:58.264 TEST_HEADER include/spdk/likely.h 00:02:58.264 TEST_HEADER include/spdk/log.h 00:02:58.264 TEST_HEADER include/spdk/lvol.h 00:02:58.264 TEST_HEADER include/spdk/memory.h 00:02:58.264 TEST_HEADER include/spdk/nbd.h 00:02:58.264 TEST_HEADER include/spdk/mmio.h 00:02:58.264 TEST_HEADER include/spdk/notify.h 00:02:58.264 TEST_HEADER include/spdk/nvme.h 00:02:58.264 TEST_HEADER include/spdk/nvme_intel.h 00:02:58.264 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:58.264 CC app/spdk_dd/spdk_dd.o 00:02:58.264 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:58.264 TEST_HEADER include/spdk/nvme_spec.h 00:02:58.264 TEST_HEADER include/spdk/nvme_zns.h 00:02:58.264 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:58.264 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:58.264 TEST_HEADER include/spdk/nvmf_spec.h 00:02:58.264 TEST_HEADER include/spdk/nvmf.h 00:02:58.264 TEST_HEADER include/spdk/opal.h 00:02:58.264 TEST_HEADER include/spdk/nvmf_transport.h 00:02:58.264 TEST_HEADER include/spdk/opal_spec.h 00:02:58.264 CC app/spdk_tgt/spdk_tgt.o 00:02:58.264 TEST_HEADER include/spdk/pci_ids.h 00:02:58.264 TEST_HEADER include/spdk/pipe.h 00:02:58.264 TEST_HEADER include/spdk/queue.h 00:02:58.264 TEST_HEADER include/spdk/reduce.h 00:02:58.264 TEST_HEADER include/spdk/scheduler.h 00:02:58.264 TEST_HEADER include/spdk/rpc.h 00:02:58.264 TEST_HEADER include/spdk/scsi.h 00:02:58.264 TEST_HEADER include/spdk/scsi_spec.h 00:02:58.264 TEST_HEADER include/spdk/stdinc.h 00:02:58.264 TEST_HEADER include/spdk/sock.h 00:02:58.264 TEST_HEADER include/spdk/string.h 00:02:58.264 TEST_HEADER include/spdk/thread.h 00:02:58.264 TEST_HEADER include/spdk/trace_parser.h 00:02:58.264 TEST_HEADER include/spdk/trace.h 00:02:58.264 TEST_HEADER include/spdk/tree.h 00:02:58.264 TEST_HEADER include/spdk/ublk.h 00:02:58.264 TEST_HEADER include/spdk/util.h 00:02:58.264 TEST_HEADER include/spdk/uuid.h 00:02:58.264 TEST_HEADER include/spdk/version.h 00:02:58.264 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:58.264 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:58.264 TEST_HEADER include/spdk/vhost.h 00:02:58.264 TEST_HEADER include/spdk/vmd.h 00:02:58.264 TEST_HEADER include/spdk/xor.h 00:02:58.264 TEST_HEADER include/spdk/zipf.h 00:02:58.264 CXX test/cpp_headers/accel.o 00:02:58.264 CXX test/cpp_headers/accel_module.o 00:02:58.264 CXX test/cpp_headers/assert.o 00:02:58.264 CXX 
test/cpp_headers/barrier.o 00:02:58.264 CXX test/cpp_headers/base64.o 00:02:58.264 CXX test/cpp_headers/bdev_module.o 00:02:58.264 CXX test/cpp_headers/bdev.o 00:02:58.264 CXX test/cpp_headers/bit_array.o 00:02:58.264 CXX test/cpp_headers/bdev_zone.o 00:02:58.264 CXX test/cpp_headers/bit_pool.o 00:02:58.264 CXX test/cpp_headers/blob_bdev.o 00:02:58.264 CXX test/cpp_headers/blobfs.o 00:02:58.264 CXX test/cpp_headers/blobfs_bdev.o 00:02:58.264 CXX test/cpp_headers/blob.o 00:02:58.264 CXX test/cpp_headers/conf.o 00:02:58.264 CXX test/cpp_headers/config.o 00:02:58.264 CXX test/cpp_headers/cpuset.o 00:02:58.264 CXX test/cpp_headers/crc16.o 00:02:58.264 CXX test/cpp_headers/crc64.o 00:02:58.264 CXX test/cpp_headers/crc32.o 00:02:58.264 CXX test/cpp_headers/dif.o 00:02:58.264 CXX test/cpp_headers/dma.o 00:02:58.264 CXX test/cpp_headers/endian.o 00:02:58.264 CXX test/cpp_headers/env_dpdk.o 00:02:58.264 CXX test/cpp_headers/event.o 00:02:58.264 CXX test/cpp_headers/env.o 00:02:58.264 CXX test/cpp_headers/fd_group.o 00:02:58.264 CXX test/cpp_headers/fd.o 00:02:58.264 CXX test/cpp_headers/file.o 00:02:58.264 CXX test/cpp_headers/ftl.o 00:02:58.264 CXX test/cpp_headers/hexlify.o 00:02:58.264 CXX test/cpp_headers/gpt_spec.o 00:02:58.264 CXX test/cpp_headers/histogram_data.o 00:02:58.528 CXX test/cpp_headers/idxd.o 00:02:58.528 CXX test/cpp_headers/idxd_spec.o 00:02:58.528 CXX test/cpp_headers/ioat.o 00:02:58.528 CXX test/cpp_headers/ioat_spec.o 00:02:58.528 CXX test/cpp_headers/init.o 00:02:58.528 CXX test/cpp_headers/iscsi_spec.o 00:02:58.528 CXX test/cpp_headers/json.o 00:02:58.528 CXX test/cpp_headers/jsonrpc.o 00:02:58.528 CXX test/cpp_headers/keyring.o 00:02:58.528 CXX test/cpp_headers/likely.o 00:02:58.528 CXX test/cpp_headers/log.o 00:02:58.528 CXX test/cpp_headers/keyring_module.o 00:02:58.528 CXX test/cpp_headers/memory.o 00:02:58.528 CXX test/cpp_headers/nbd.o 00:02:58.528 CXX test/cpp_headers/mmio.o 00:02:58.528 CXX test/cpp_headers/lvol.o 00:02:58.528 CXX test/cpp_headers/nvme_intel.o 00:02:58.528 CXX test/cpp_headers/nvme.o 00:02:58.528 CXX test/cpp_headers/notify.o 00:02:58.528 CXX test/cpp_headers/nvme_ocssd.o 00:02:58.528 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:58.528 CXX test/cpp_headers/nvmf_cmd.o 00:02:58.528 CXX test/cpp_headers/nvme_spec.o 00:02:58.528 CXX test/cpp_headers/nvme_zns.o 00:02:58.528 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:58.528 CXX test/cpp_headers/nvmf.o 00:02:58.528 CXX test/cpp_headers/opal.o 00:02:58.528 CXX test/cpp_headers/nvmf_spec.o 00:02:58.528 CXX test/cpp_headers/pci_ids.o 00:02:58.528 CXX test/cpp_headers/pipe.o 00:02:58.528 CXX test/cpp_headers/nvmf_transport.o 00:02:58.528 CXX test/cpp_headers/opal_spec.o 00:02:58.528 CXX test/cpp_headers/queue.o 00:02:58.528 LINK spdk_lspci 00:02:58.528 CXX test/cpp_headers/scheduler.o 00:02:58.528 CXX test/cpp_headers/reduce.o 00:02:58.528 CXX test/cpp_headers/rpc.o 00:02:58.528 CC examples/ioat/verify/verify.o 00:02:58.528 CXX test/cpp_headers/sock.o 00:02:58.528 CC examples/util/zipf/zipf.o 00:02:58.528 CXX test/cpp_headers/stdinc.o 00:02:58.528 CXX test/cpp_headers/scsi.o 00:02:58.528 CXX test/cpp_headers/scsi_spec.o 00:02:58.528 CXX test/cpp_headers/string.o 00:02:58.528 CXX test/cpp_headers/thread.o 00:02:58.528 CXX test/cpp_headers/trace_parser.o 00:02:58.528 CXX test/cpp_headers/trace.o 00:02:58.528 CC examples/ioat/perf/perf.o 00:02:58.528 CXX test/cpp_headers/tree.o 00:02:58.528 CXX test/cpp_headers/uuid.o 00:02:58.528 CXX test/cpp_headers/ublk.o 00:02:58.528 CXX test/cpp_headers/vfio_user_pci.o 
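The long run of "CXX test/cpp_headers/<name>.o" compile lines above and below is the public-header self-containment check: every header under include/spdk/ gets its own translation unit that does nothing but include it, so a header that fails to pull in its own dependencies breaks the build here. The actual Makefile rule is not visible in this log; a rough, hypothetical stand-in for what each of those compile steps amounts to (compiler flags and the repository-root path are assumptions) is:

  cd /path/to/spdk                      # repository root (placeholder path)
  for hdr in include/spdk/*.h; do
    name=$(basename "${hdr%.h}")
    # Build an object from a source that only includes the header; if the
    # header is not self-contained, this compile step fails.
    echo "#include <spdk/$name.h>" | \
      g++ -x c++ -std=c++11 -I include -c - -o "test/cpp_headers/$name.o"
  done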
00:02:58.528 CXX test/cpp_headers/version.o 00:02:58.528 CXX test/cpp_headers/util.o 00:02:58.528 CXX test/cpp_headers/vfio_user_spec.o 00:02:58.528 CC test/thread/poller_perf/poller_perf.o 00:02:58.528 CXX test/cpp_headers/vhost.o 00:02:58.528 CXX test/cpp_headers/xor.o 00:02:58.528 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:58.528 CXX test/cpp_headers/vmd.o 00:02:58.528 CC test/app/jsoncat/jsoncat.o 00:02:58.528 CC test/env/pci/pci_ut.o 00:02:58.528 CXX test/cpp_headers/zipf.o 00:02:58.528 CC test/app/histogram_perf/histogram_perf.o 00:02:58.528 CC test/app/stub/stub.o 00:02:58.528 CC test/env/vtophys/vtophys.o 00:02:58.528 CC test/dma/test_dma/test_dma.o 00:02:58.528 CC app/fio/nvme/fio_plugin.o 00:02:58.528 CC test/env/memory/memory_ut.o 00:02:58.528 LINK rpc_client_test 00:02:58.793 LINK spdk_nvme_discover 00:02:58.793 CC app/fio/bdev/fio_plugin.o 00:02:58.793 LINK interrupt_tgt 00:02:58.793 CC test/app/bdev_svc/bdev_svc.o 00:02:58.793 LINK iscsi_tgt 00:02:59.054 LINK spdk_trace_record 00:02:59.054 LINK spdk_trace 00:02:59.054 LINK spdk_tgt 00:02:59.054 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:59.054 CC test/env/mem_callbacks/mem_callbacks.o 00:02:59.054 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:59.054 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:59.054 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:59.313 LINK nvmf_tgt 00:02:59.313 LINK zipf 00:02:59.313 LINK poller_perf 00:02:59.313 LINK histogram_perf 00:02:59.313 LINK spdk_dd 00:02:59.313 LINK jsoncat 00:02:59.573 LINK env_dpdk_post_init 00:02:59.573 LINK bdev_svc 00:02:59.573 LINK stub 00:02:59.573 LINK vtophys 00:02:59.573 LINK verify 00:02:59.573 LINK ioat_perf 00:02:59.833 CC app/vhost/vhost.o 00:02:59.833 LINK test_dma 00:02:59.833 LINK nvme_fuzz 00:02:59.833 LINK pci_ut 00:02:59.833 LINK spdk_bdev 00:02:59.833 LINK spdk_nvme_perf 00:02:59.833 LINK vhost_fuzz 00:02:59.833 LINK spdk_nvme 00:02:59.833 CC test/event/event_perf/event_perf.o 00:02:59.833 CC test/event/reactor/reactor.o 00:03:00.093 CC examples/idxd/perf/perf.o 00:03:00.093 LINK vhost 00:03:00.093 CC test/event/reactor_perf/reactor_perf.o 00:03:00.093 CC examples/sock/hello_world/hello_sock.o 00:03:00.093 LINK spdk_top 00:03:00.093 CC examples/vmd/lsvmd/lsvmd.o 00:03:00.093 CC test/event/app_repeat/app_repeat.o 00:03:00.093 LINK spdk_nvme_identify 00:03:00.093 CC examples/vmd/led/led.o 00:03:00.093 CC test/event/scheduler/scheduler.o 00:03:00.093 CC examples/thread/thread/thread_ex.o 00:03:00.093 LINK mem_callbacks 00:03:00.093 LINK reactor 00:03:00.093 LINK event_perf 00:03:00.093 LINK reactor_perf 00:03:00.093 LINK lsvmd 00:03:00.093 LINK led 00:03:00.093 LINK app_repeat 00:03:00.354 LINK hello_sock 00:03:00.354 LINK scheduler 00:03:00.354 LINK idxd_perf 00:03:00.354 LINK thread 00:03:00.354 LINK memory_ut 00:03:00.354 CC test/nvme/boot_partition/boot_partition.o 00:03:00.354 CC test/nvme/reset/reset.o 00:03:00.354 CC test/nvme/e2edp/nvme_dp.o 00:03:00.354 CC test/nvme/simple_copy/simple_copy.o 00:03:00.354 CC test/nvme/overhead/overhead.o 00:03:00.354 CC test/nvme/aer/aer.o 00:03:00.354 CC test/nvme/connect_stress/connect_stress.o 00:03:00.354 CC test/nvme/reserve/reserve.o 00:03:00.354 CC test/nvme/startup/startup.o 00:03:00.354 CC test/nvme/err_injection/err_injection.o 00:03:00.354 CC test/nvme/sgl/sgl.o 00:03:00.354 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:00.354 CC test/nvme/cuse/cuse.o 00:03:00.354 CC test/nvme/compliance/nvme_compliance.o 00:03:00.354 CC test/nvme/fused_ordering/fused_ordering.o 00:03:00.354 CC 
test/nvme/fdp/fdp.o 00:03:00.354 CC test/blobfs/mkfs/mkfs.o 00:03:00.354 CC test/accel/dif/dif.o 00:03:00.614 CC test/lvol/esnap/esnap.o 00:03:00.614 LINK boot_partition 00:03:00.614 LINK connect_stress 00:03:00.614 LINK startup 00:03:00.614 LINK doorbell_aers 00:03:00.614 LINK reserve 00:03:00.614 LINK err_injection 00:03:00.614 LINK fused_ordering 00:03:00.614 LINK reset 00:03:00.614 LINK simple_copy 00:03:00.614 LINK nvme_dp 00:03:00.614 LINK sgl 00:03:00.614 LINK mkfs 00:03:00.614 LINK aer 00:03:00.614 LINK overhead 00:03:00.614 LINK fdp 00:03:00.874 LINK nvme_compliance 00:03:00.874 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:00.874 CC examples/nvme/hello_world/hello_world.o 00:03:00.874 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:00.874 CC examples/nvme/arbitration/arbitration.o 00:03:00.874 CC examples/nvme/reconnect/reconnect.o 00:03:00.874 CC examples/nvme/hotplug/hotplug.o 00:03:00.874 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:00.874 CC examples/nvme/abort/abort.o 00:03:00.874 LINK dif 00:03:00.874 CC examples/accel/perf/accel_perf.o 00:03:00.874 CC examples/blob/hello_world/hello_blob.o 00:03:00.874 CC examples/blob/cli/blobcli.o 00:03:00.874 LINK cmb_copy 00:03:01.133 LINK pmr_persistence 00:03:01.133 LINK iscsi_fuzz 00:03:01.133 LINK hello_world 00:03:01.133 LINK hotplug 00:03:01.133 LINK reconnect 00:03:01.133 LINK arbitration 00:03:01.133 LINK abort 00:03:01.133 LINK nvme_manage 00:03:01.133 LINK hello_blob 00:03:01.394 LINK accel_perf 00:03:01.394 LINK blobcli 00:03:01.394 CC test/bdev/bdevio/bdevio.o 00:03:01.394 LINK cuse 00:03:01.965 LINK bdevio 00:03:01.965 CC examples/bdev/hello_world/hello_bdev.o 00:03:01.965 CC examples/bdev/bdevperf/bdevperf.o 00:03:02.227 LINK hello_bdev 00:03:02.798 LINK bdevperf 00:03:03.370 CC examples/nvmf/nvmf/nvmf.o 00:03:03.630 LINK nvmf 00:03:04.573 LINK esnap 00:03:04.834 00:03:04.834 real 1m31.738s 00:03:04.834 user 14m17.880s 00:03:04.834 sys 6m55.907s 00:03:04.834 07:38:49 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:04.834 07:38:49 make -- common/autotest_common.sh@10 -- $ set +x 00:03:04.834 ************************************ 00:03:04.834 END TEST make 00:03:04.834 ************************************ 00:03:05.096 07:38:49 -- common/autotest_common.sh@1142 -- $ return 0 00:03:05.096 07:38:49 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:05.096 07:38:49 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:05.096 07:38:49 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:05.096 07:38:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:05.096 07:38:49 -- pm/common@44 -- $ pid=1407518 00:03:05.096 07:38:49 -- pm/common@50 -- $ kill -TERM 1407518 00:03:05.096 07:38:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:05.096 07:38:49 -- pm/common@44 -- $ pid=1407519 00:03:05.096 07:38:49 -- pm/common@50 -- $ kill -TERM 1407519 00:03:05.096 07:38:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:05.096 07:38:49 -- pm/common@44 -- $ pid=1407522 00:03:05.096 07:38:49 -- pm/common@50 -- $ kill -TERM 1407522 00:03:05.096 
07:38:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:05.096 07:38:49 -- pm/common@44 -- $ pid=1407545 00:03:05.096 07:38:49 -- pm/common@50 -- $ sudo -E kill -TERM 1407545 00:03:05.096 07:38:49 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:05.096 07:38:49 -- nvmf/common.sh@7 -- # uname -s 00:03:05.096 07:38:49 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:05.096 07:38:49 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:05.096 07:38:49 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:05.096 07:38:49 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:05.096 07:38:49 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:05.096 07:38:49 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:05.096 07:38:49 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:05.096 07:38:49 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:05.096 07:38:49 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:05.096 07:38:49 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:05.096 07:38:49 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:03:05.096 07:38:49 -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:03:05.096 07:38:49 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:05.096 07:38:49 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:05.096 07:38:49 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:05.096 07:38:49 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:05.096 07:38:49 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:05.096 07:38:49 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:05.096 07:38:49 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:05.096 07:38:49 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:05.096 07:38:49 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:05.096 07:38:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:05.096 07:38:49 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:05.096 07:38:49 -- paths/export.sh@5 -- # export PATH 00:03:05.096 07:38:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:05.096 07:38:49 -- nvmf/common.sh@47 -- # 
: 0 00:03:05.096 07:38:49 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:05.096 07:38:49 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:05.096 07:38:49 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:05.096 07:38:49 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:05.096 07:38:49 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:05.096 07:38:49 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:05.096 07:38:49 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:05.096 07:38:49 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:05.096 07:38:49 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:05.096 07:38:49 -- spdk/autotest.sh@32 -- # uname -s 00:03:05.096 07:38:49 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:05.096 07:38:49 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:05.096 07:38:49 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:05.096 07:38:49 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:05.096 07:38:49 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:05.096 07:38:49 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:05.096 07:38:49 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:05.096 07:38:49 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:05.096 07:38:49 -- spdk/autotest.sh@48 -- # udevadm_pid=1478040 00:03:05.096 07:38:49 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:05.096 07:38:49 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:05.096 07:38:49 -- pm/common@17 -- # local monitor 00:03:05.096 07:38:49 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:05.096 07:38:49 -- pm/common@21 -- # date +%s 00:03:05.096 07:38:49 -- pm/common@25 -- # sleep 1 00:03:05.096 07:38:49 -- pm/common@21 -- # date +%s 00:03:05.096 07:38:49 -- pm/common@21 -- # date +%s 00:03:05.096 07:38:49 -- pm/common@21 -- # date +%s 00:03:05.096 07:38:49 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721021929 00:03:05.096 07:38:49 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721021929 00:03:05.096 07:38:49 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721021929 00:03:05.096 07:38:49 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721021929 00:03:05.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721021929_collect-vmstat.pm.log 00:03:05.400 Redirecting to 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721021929_collect-cpu-load.pm.log 00:03:05.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721021929_collect-cpu-temp.pm.log 00:03:05.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721021929_collect-bmc-pm.bmc.pm.log 00:03:06.362 07:38:50 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:06.362 07:38:50 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:06.362 07:38:50 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:06.362 07:38:50 -- common/autotest_common.sh@10 -- # set +x 00:03:06.362 07:38:50 -- spdk/autotest.sh@59 -- # create_test_list 00:03:06.362 07:38:50 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:06.362 07:38:50 -- common/autotest_common.sh@10 -- # set +x 00:03:06.362 07:38:50 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:06.362 07:38:50 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:06.362 07:38:50 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:06.362 07:38:50 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:06.362 07:38:50 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:06.362 07:38:50 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:06.362 07:38:50 -- common/autotest_common.sh@1455 -- # uname 00:03:06.362 07:38:50 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:06.362 07:38:50 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:06.362 07:38:50 -- common/autotest_common.sh@1475 -- # uname 00:03:06.362 07:38:50 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:06.362 07:38:50 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:06.362 07:38:50 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:06.362 07:38:50 -- spdk/autotest.sh@72 -- # hash lcov 00:03:06.362 07:38:50 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:06.362 07:38:50 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:06.362 --rc lcov_branch_coverage=1 00:03:06.362 --rc lcov_function_coverage=1 00:03:06.362 --rc genhtml_branch_coverage=1 00:03:06.362 --rc genhtml_function_coverage=1 00:03:06.362 --rc genhtml_legend=1 00:03:06.362 --rc geninfo_all_blocks=1 00:03:06.362 ' 00:03:06.362 07:38:50 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:06.362 --rc lcov_branch_coverage=1 00:03:06.362 --rc lcov_function_coverage=1 00:03:06.362 --rc genhtml_branch_coverage=1 00:03:06.362 --rc genhtml_function_coverage=1 00:03:06.362 --rc genhtml_legend=1 00:03:06.362 --rc geninfo_all_blocks=1 00:03:06.362 ' 00:03:06.362 07:38:50 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:06.362 --rc lcov_branch_coverage=1 00:03:06.362 --rc lcov_function_coverage=1 00:03:06.362 --rc genhtml_branch_coverage=1 00:03:06.362 --rc genhtml_function_coverage=1 00:03:06.362 --rc genhtml_legend=1 00:03:06.362 --rc geninfo_all_blocks=1 00:03:06.362 --no-external' 00:03:06.362 07:38:50 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:06.362 --rc lcov_branch_coverage=1 00:03:06.362 --rc lcov_function_coverage=1 00:03:06.362 --rc genhtml_branch_coverage=1 00:03:06.362 --rc genhtml_function_coverage=1 00:03:06.362 --rc genhtml_legend=1 00:03:06.362 --rc geninfo_all_blocks=1 00:03:06.362 
--no-external' 00:03:06.362 07:38:50 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:06.362 lcov: LCOV version 1.14 00:03:06.362 07:38:50 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:10.569 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:10.569 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:10.569 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:10.570 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:10.570 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:10.570 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:10.570 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:10.570 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:10.570 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:10.830 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:10.830 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:10.830 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:10.831 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:10.831 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:10.831 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:10.831 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:11.093 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:11.093 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:11.354 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:11.354 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:29.474 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:29.474 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:34.761 07:39:18 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:34.761 07:39:18 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:34.761 07:39:18 -- common/autotest_common.sh@10 -- # set +x 00:03:34.761 07:39:18 -- spdk/autotest.sh@91 -- # rm -f 00:03:34.761 07:39:18 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:38.061 0000:80:01.6 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.7 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.4 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.5 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.2 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.3 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.0 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:80:01.1 (8086 0b00): Already using the ioatdma driver 00:03:38.061 0000:65:00.0 (8086 0a54): Already using the nvme driver 00:03:38.322 0000:00:01.6 (8086 0b00): Already using the ioatdma driver 00:03:38.322 0000:00:01.7 (8086 0b00): Already using the ioatdma driver 00:03:38.322 0000:00:01.4 (8086 0b00): Already using the ioatdma driver 00:03:38.322 0000:00:01.5 (8086 0b00): Already using the ioatdma driver 00:03:38.322 0000:00:01.2 (8086 0b00): Already using the ioatdma driver 00:03:38.322 0000:00:01.3 (8086 0b00): Already using the ioatdma 
driver 00:03:38.322 0000:00:01.0 (8086 0b00): Already using the ioatdma driver 00:03:38.322 0000:00:01.1 (8086 0b00): Already using the ioatdma driver 00:03:38.322 07:39:23 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:38.322 07:39:23 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:38.322 07:39:23 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:38.322 07:39:23 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:38.322 07:39:23 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:38.322 07:39:23 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:38.322 07:39:23 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:38.322 07:39:23 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:38.322 07:39:23 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:38.322 07:39:23 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:38.322 07:39:23 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:38.322 07:39:23 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:38.322 07:39:23 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:38.322 07:39:23 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:38.322 07:39:23 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:38.322 No valid GPT data, bailing 00:03:38.584 07:39:23 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:38.585 07:39:23 -- scripts/common.sh@391 -- # pt= 00:03:38.585 07:39:23 -- scripts/common.sh@392 -- # return 1 00:03:38.585 07:39:23 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:38.585 1+0 records in 00:03:38.585 1+0 records out 00:03:38.585 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00435143 s, 241 MB/s 00:03:38.585 07:39:23 -- spdk/autotest.sh@118 -- # sync 00:03:38.585 07:39:23 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:38.585 07:39:23 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:38.585 07:39:23 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:46.721 07:39:30 -- spdk/autotest.sh@124 -- # uname -s 00:03:46.721 07:39:30 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:46.721 07:39:30 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:46.721 07:39:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:46.721 07:39:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.721 07:39:30 -- common/autotest_common.sh@10 -- # set +x 00:03:46.721 ************************************ 00:03:46.721 START TEST setup.sh 00:03:46.721 ************************************ 00:03:46.721 07:39:30 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:46.721 * Looking for test storage... 
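The get_zoned_devs trace just above checks each /sys/block/nvme* entry for a queue/zoned attribute before any destructive step (the dd that stamps /dev/nvme0n1 is only reached for namespaces not recorded as zoned). A minimal standalone restatement of that filter, keeping the names from the xtrace but simplified to record device names rather than PCI addresses, is:

  declare -A zoned_devs
  for nvme in /sys/block/nvme*; do
    dev=${nvme##*/}                                       # e.g. nvme0n1
    [[ -e $nvme/queue/zoned ]] || continue                # attribute absent: not zoned
    [[ $(cat "$nvme/queue/zoned") == none ]] && continue  # "none": ordinary namespace
    zoned_devs[$dev]=1                                    # anything else: treat as zoned
  done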
00:03:46.721 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:46.721 07:39:30 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:46.721 07:39:30 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:46.721 07:39:30 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:46.721 07:39:30 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:46.721 07:39:30 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:46.721 07:39:30 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:46.721 ************************************ 00:03:46.721 START TEST acl 00:03:46.721 ************************************ 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:46.721 * Looking for test storage... 00:03:46.721 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:46.721 07:39:30 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:46.721 07:39:30 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:46.721 07:39:30 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:46.721 07:39:30 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:46.721 07:39:30 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:46.721 07:39:30 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:46.721 07:39:30 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:46.721 07:39:30 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:46.721 07:39:30 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:50.954 07:39:35 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:50.954 07:39:35 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:50.954 07:39:35 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:50.954 07:39:35 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:50.954 07:39:35 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.954 07:39:35 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:54.249 Hugepages 00:03:54.249 node hugesize free / total 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
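collect_setup_devs, traced from setup/acl.sh@15 onward, parses the "setup.sh status" table that follows: each row is split with read and only PCI addresses bound to the nvme driver are kept (the PCI_BLOCKED match at acl.sh@21 is left out of this sketch). Condensed into a standalone loop, with the field order taken from the status header line below, it is roughly:

  declare -a devs        # collected NVMe BDFs
  declare -A drivers     # BDF -> bound driver
  while read -r _ dev _ _ _ driver _; do
    [[ $dev == *:*:*.* ]] || continue    # keep PCI addresses, skip hugepage/header rows
    [[ $driver == nvme ]] || continue    # ioatdma and other drivers are ignored
    devs+=("$dev")
    drivers["$dev"]=$driver
  done < <(./scripts/setup.sh status)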
00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 00:03:54.249 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.0 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.1 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.2 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.3 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.4 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.5 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.6 == *:*:*.* ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.249 07:39:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.7 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:65:00.0 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:54.509 07:39:39 setup.sh.acl -- 
setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.0 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.1 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.2 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.3 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.4 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.5 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.6 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.7 == *:*:*.* ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:54.509 07:39:39 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:54.509 07:39:39 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:54.509 07:39:39 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.509 07:39:39 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:54.510 ************************************ 00:03:54.510 START TEST denied 00:03:54.510 ************************************ 00:03:54.510 07:39:39 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:03:54.510 07:39:39 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:65:00.0' 00:03:54.510 07:39:39 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 
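The denied test above block-lists the only NVMe controller and then asks setup.sh for its configuration; stripped of the run_test wrapper, the assertion it makes (paths relative to the repository root, root privileges assumed for setup.sh) is essentially:

  # With the controller on PCI_BLOCKED, "setup.sh config" must report it as skipped.
  export PCI_BLOCKED=' 0000:65:00.0'
  ./scripts/setup.sh config | grep 'Skipping denied controller at 0000:65:00.0'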
00:03:54.510 07:39:39 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:65:00.0' 00:03:54.510 07:39:39 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.510 07:39:39 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:58.712 0000:65:00.0 (8086 0a54): Skipping denied controller at 0000:65:00.0 00:03:58.712 07:39:43 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:65:00.0 00:03:58.712 07:39:43 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:58.712 07:39:43 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:58.712 07:39:43 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:65:00.0 ]] 00:03:58.713 07:39:43 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:65:00.0/driver 00:03:58.713 07:39:43 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:58.713 07:39:43 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:58.713 07:39:43 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:58.713 07:39:43 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:58.713 07:39:43 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:04.000 00:04:04.000 real 0m9.321s 00:04:04.000 user 0m3.016s 00:04:04.000 sys 0m5.560s 00:04:04.000 07:39:48 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:04.000 07:39:48 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:04.000 ************************************ 00:04:04.000 END TEST denied 00:04:04.000 ************************************ 00:04:04.000 07:39:48 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:04.000 07:39:48 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:04.000 07:39:48 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.000 07:39:48 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.000 07:39:48 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:04.000 ************************************ 00:04:04.000 START TEST allowed 00:04:04.000 ************************************ 00:04:04.000 07:39:48 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:04:04.000 07:39:48 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:65:00.0 00:04:04.000 07:39:48 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:65:00.0 .*: nvme -> .*' 00:04:04.000 07:39:48 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:04.000 07:39:48 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.000 07:39:48 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:10.584 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:04:10.584 07:39:54 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:10.584 07:39:54 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:10.584 07:39:54 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:10.584 07:39:54 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:10.584 07:39:54 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:14.786 00:04:14.786 real 0m10.228s 00:04:14.786 
user 0m2.998s 00:04:14.786 sys 0m5.446s 00:04:14.786 07:39:58 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:14.786 07:39:58 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:14.786 ************************************ 00:04:14.786 END TEST allowed 00:04:14.786 ************************************ 00:04:14.786 07:39:58 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:14.786 00:04:14.786 real 0m28.181s 00:04:14.786 user 0m9.358s 00:04:14.786 sys 0m16.532s 00:04:14.786 07:39:58 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:14.786 07:39:58 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:14.786 ************************************ 00:04:14.786 END TEST acl 00:04:14.786 ************************************ 00:04:14.786 07:39:58 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:14.786 07:39:58 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:14.786 07:39:58 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:14.786 07:39:58 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:14.786 07:39:58 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:14.786 ************************************ 00:04:14.786 START TEST hugepages 00:04:14.786 ************************************ 00:04:14.786 07:39:58 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:14.786 * Looking for test storage... 00:04:14.786 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.786 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 108340264 kB' 'MemAvailable: 111494660 kB' 'Buffers: 12464 kB' 'Cached: 9335764 kB' 'SwapCached: 0 kB' 'Active: 6424564 kB' 'Inactive: 3458012 kB' 'Active(anon): 6029952 kB' 
'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537552 kB' 'Mapped: 183328 kB' 'Shmem: 5495604 kB' 'KReclaimable: 252232 kB' 'Slab: 870872 kB' 'SReclaimable: 252232 kB' 'SUnreclaim: 618640 kB' 'KernelStack: 24912 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69463468 kB' 'Committed_AS: 7566840 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226864 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.787 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:14.788 
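The long scan just above is the get_meminfo helper from setup/common.sh walking the meminfo dump key by key until it reaches Hugepagesize, echoing the value (2048) and returning; hugepages.sh stores that as default_hugepages, and clear_hp (whose body follows) then loops over each node's hugepages-* entries and echoes 0, resetting the pools before the individual tests set their own counts. A condensed sketch of that lookup, simplified from the trace (the traced helper also takes an optional node argument, switching to /sys/devices/system/node/node$node/meminfo and stripping the 'Node N' prefixes, and it iterates a mapfile'd copy rather than the file directly):

  get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
      [[ $var == "$get" ]] || continue
      echo "$val"          # the kB suffix lands in the discarded field, so only the number is printed
      return 0
    done < /proc/meminfo
    return 1
  }

  default_hugepages=$(get_meminfo Hugepagesize)   # -> 2048 in this run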
07:39:59 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:14.788 07:39:59 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:14.788 07:39:59 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:14.788 07:39:59 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:14.788 07:39:59 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:14.788 ************************************ 00:04:14.788 START TEST default_setup 00:04:14.788 ************************************ 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:14.788 07:39:59 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:14.788 07:39:59 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:18.993 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:04:18.993 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:20.374 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.375 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110522268 kB' 'MemAvailable: 113676668 kB' 'Buffers: 12464 kB' 'Cached: 9335908 kB' 'SwapCached: 0 kB' 'Active: 6444116 kB' 'Inactive: 3458012 kB' 'Active(anon): 6049504 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 557048 kB' 'Mapped: 183504 kB' 'Shmem: 5495748 kB' 'KReclaimable: 252240 kB' 'Slab: 869024 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 616784 kB' 'KernelStack: 24960 kB' 'PageTables: 8580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7594780 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226672 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.375 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.640 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.641 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110522808 kB' 'MemAvailable: 113677208 kB' 'Buffers: 12464 kB' 'Cached: 9335908 kB' 'SwapCached: 0 kB' 'Active: 6444488 kB' 'Inactive: 3458012 kB' 'Active(anon): 6049876 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 557480 kB' 'Mapped: 183444 kB' 'Shmem: 5495748 kB' 'KReclaimable: 252240 kB' 'Slab: 869024 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 616784 kB' 'KernelStack: 24976 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7594796 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226640 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.642 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 
07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@99 -- # surp=0 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.643 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110523856 kB' 'MemAvailable: 113678256 kB' 'Buffers: 12464 kB' 'Cached: 9335912 kB' 'SwapCached: 0 kB' 'Active: 6443808 kB' 'Inactive: 3458012 kB' 'Active(anon): 6049196 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 556776 kB' 'Mapped: 183504 kB' 'Shmem: 5495752 kB' 'KReclaimable: 252240 kB' 'Slab: 869108 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 616868 kB' 'KernelStack: 24944 kB' 'PageTables: 8528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7594820 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226640 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.644 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.644 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:20.645 nr_hugepages=1024 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:20.645 resv_hugepages=0 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:20.645 surplus_hugepages=0 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:20.645 anon_hugepages=0 00:04:20.645 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:20.646 
07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110521244 kB' 'MemAvailable: 113675644 kB' 'Buffers: 12464 kB' 'Cached: 9335948 kB' 'SwapCached: 0 kB' 'Active: 6445564 kB' 'Inactive: 3458012 kB' 'Active(anon): 6050952 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558476 kB' 'Mapped: 184008 kB' 'Shmem: 5495788 kB' 'KReclaimable: 252240 kB' 'Slab: 869108 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 616868 kB' 'KernelStack: 24928 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7597384 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226640 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
[[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.646 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 
07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.647 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60965724 kB' 'MemUsed: 4696276 kB' 'SwapCached: 0 kB' 'Active: 1513016 kB' 'Inactive: 129584 kB' 'Active(anon): 1194424 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379012 kB' 'Mapped: 115760 kB' 'AnonPages: 266764 kB' 'Shmem: 930836 kB' 'KernelStack: 13976 kB' 'PageTables: 5212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132620 kB' 'Slab: 405372 kB' 'SReclaimable: 132620 kB' 'SUnreclaim: 272752 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
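(Editor's note) At setup/hugepages.sh@117 the same helper is invoked as get_meminfo HugePages_Surp 0, so mem_f switches to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix; the mapfile and ${mem[@]#Node +([0-9]) } steps traced above strip that prefix before the per-key scan repeats. A hedged sketch of that per-node variant (extglob is required for the +([0-9]) pattern; the value in the comment is the one this run prints):

    shopt -s extglob                                  # needed for the +([0-9]) pattern below
    node=0
    mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
    mem=("${mem[@]#Node +([0-9]) }")                  # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == HugePages_Surp ]] && echo "$val"   # 0 on node0 in this run
    done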
00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.648 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:20.649 node0=1024 expecting 1024 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:20.649 00:04:20.649 real 0m6.126s 00:04:20.649 user 0m1.627s 00:04:20.649 sys 0m2.647s 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:20.649 07:40:05 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:20.649 ************************************ 00:04:20.649 END TEST default_setup 00:04:20.649 ************************************ 00:04:20.649 07:40:05 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:20.649 07:40:05 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:20.649 07:40:05 setup.sh.hugepages -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:20.649 07:40:05 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:20.649 07:40:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:20.649 ************************************ 00:04:20.649 START TEST per_node_1G_alloc 00:04:20.649 ************************************ 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:20.649 07:40:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:24.901 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 
00:04:24.901 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:24.901 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 
110515656 kB' 'MemAvailable: 113670056 kB' 'Buffers: 12464 kB' 'Cached: 9336080 kB' 'SwapCached: 0 kB' 'Active: 6442052 kB' 'Inactive: 3458012 kB' 'Active(anon): 6047440 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553780 kB' 'Mapped: 182712 kB' 'Shmem: 5495920 kB' 'KReclaimable: 252240 kB' 'Slab: 869428 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 617188 kB' 'KernelStack: 25040 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7576268 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226880 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.901 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.902 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110515956 kB' 'MemAvailable: 113670356 kB' 'Buffers: 12464 kB' 'Cached: 9336080 kB' 'SwapCached: 0 kB' 'Active: 6441148 kB' 'Inactive: 3458012 kB' 'Active(anon): 6046536 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553804 kB' 'Mapped: 182624 kB' 'Shmem: 5495920 kB' 'KReclaimable: 252240 kB' 'Slab: 869376 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 617136 kB' 'KernelStack: 25040 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7574668 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226800 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 
'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
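(Editor's note) What this stretch of trace is driving at: verify_nr_hugepages re-reads the HugePages_* counters after scripts/setup.sh was run with NRHUGE=512 and HUGENODE=0,1, so the global total should match the requested 2 x 512 pages plus any surplus/reserved pages (the same kind of check seen earlier at setup/hugepages.sh@110), and each node's own sysfs counter should report its 512-page share, mirroring the "node0=1024 expecting 1024" style of output above. A rough, hedged sketch of that check using standard kernel sysfs paths rather than the SPDK helper itself (2048 kB matches the Hugepagesize reported in this log):

    expected_per_node=512
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    (( total == 2 * expected_per_node + surp + resv )) || echo "unexpected global hugepage count: $total"

    for node_dir in /sys/devices/system/node/node[0-9]*; do
        per_node=$(cat "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
        echo "${node_dir##*/}=${per_node} expecting ${expected_per_node}"
    done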
00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.903 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110516572 kB' 'MemAvailable: 113670972 kB' 'Buffers: 12464 kB' 'Cached: 9336096 kB' 'SwapCached: 0 kB' 'Active: 6440772 kB' 'Inactive: 3458012 kB' 'Active(anon): 6046160 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553448 kB' 'Mapped: 182624 kB' 'Shmem: 5495936 kB' 'KReclaimable: 252240 kB' 'Slab: 869504 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 617264 kB' 'KernelStack: 24992 kB' 'PageTables: 8100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7576308 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226880 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
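(Editorial aside: the trace up to this point is setup/common.sh's get_meminfo walking /proc/meminfo entry by entry until it reaches HugePages_Surp, echoing its value, which setup/hugepages.sh stores as surp=0. A minimal sketch of that lookup pattern, under a hypothetical helper name meminfo_value -- the real logic is get_meminfo in setup/common.sh, which also shows the empty-node default above -- not the script's literal code:)

    # meminfo_value <field> [<numa-node>] -- hypothetical stand-in for the
    # lookup traced above: pick a meminfo source, strip the "Node <n> "
    # prefix that only the per-node files carry, and print the first value
    # column after "<field>:" (the kB unit, when present, is dropped, which
    # matches the val field the trace reads).
    meminfo_value() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node${node}/meminfo ]]; then
            mem_f=/sys/devices/system/node/node${node}/meminfo
        fi
        sed 's/^Node [0-9]* //' "$mem_f" | awk -v f="${get}:" '$1 == f { print $2; exit }'
    }

    surp=$(meminfo_value HugePages_Surp)   # 0 on this host, matching the trace above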
'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.904 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.905 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:24.906 nr_hugepages=1024 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.906 resv_hugepages=0 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:24.906 surplus_hugepages=0 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.906 anon_hugepages=0 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.906 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110518496 kB' 'MemAvailable: 113672896 kB' 'Buffers: 12464 kB' 'Cached: 9336128 kB' 'SwapCached: 0 kB' 'Active: 6441300 kB' 'Inactive: 3458012 kB' 'Active(anon): 6046688 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 
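(Editorial aside: at this point the system-wide pass is done -- surp=0, resv=0, the configured pool is nr_hugepages=1024 -- and setup/hugepages.sh is checking that the kernel's HugePages_Total agrees with nr_hugepages + surp + resv before splitting the pool across NUMA nodes. A rough re-statement of that consistency check, not the script's literal code, reusing the hypothetical meminfo_value helper sketched earlier:)

    nr_hugepages=1024                           # requested pool size, as echoed in the trace
    surp=$(meminfo_value HugePages_Surp)        # 0 in this run
    resv=$(meminfo_value HugePages_Rsvd)        # 0 in this run
    total=$(meminfo_value HugePages_Total)      # 1024 in this run
    if (( total == nr_hugepages + surp + resv )); then
        echo "huge page pool consistent: total=$total nr=$nr_hugepages surp=$surp resv=$resv"
    else
        echo "huge page pool out of sync: total=$total nr=$nr_hugepages surp=$surp resv=$resv" >&2
        exit 1
    fi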
'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553968 kB' 'Mapped: 182624 kB' 'Shmem: 5495968 kB' 'KReclaimable: 252240 kB' 'Slab: 869504 kB' 'SReclaimable: 252240 kB' 'SUnreclaim: 617264 kB' 'KernelStack: 24992 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7576332 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226880 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.907 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62005596 kB' 'MemUsed: 3656404 kB' 'SwapCached: 0 kB' 'Active: 1508092 kB' 'Inactive: 129584 kB' 'Active(anon): 1189500 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379060 kB' 'Mapped: 115344 kB' 'AnonPages: 261768 kB' 'Shmem: 930884 kB' 'KernelStack: 14120 kB' 'PageTables: 5180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132620 kB' 'Slab: 405336 kB' 'SReclaimable: 132620 kB' 'SUnreclaim: 272716 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.908 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.909 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
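The field-by-field scan traced above and below is the generic meminfo lookup from setup/common.sh: the file is read into an array, any leading "Node N " prefix is stripped, and each "key: value" pair is compared against the requested field until it matches, at which point the value is echoed. The following is a minimal standalone sketch reconstructed from the commands visible in the xtrace; argument handling and anything not shown in the trace are assumptions, and the function name is mine.

    # Minimal sketch of the lookup pattern visible in the xtrace; not the SPDK
    # helper itself. Anything not shown in the trace is an assumption.
    get_meminfo_sketch() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # A per-node query switches to that node's own meminfo file when it exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        shopt -s extglob
        local -a mem
        mapfile -t mem < "$mem_f"
        # Per-node lines carry a "Node <N> " prefix; strip it as the trace does.
        mem=("${mem[@]#Node +([0-9]) }")
        local var val _
        while IFS=': ' read -r var val _; do
            # Skip every field until the requested one, then print its value.
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
    # e.g. get_meminfo_sketch HugePages_Surp 0   # prints 0 on the node-0 pass traced here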
00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.910 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 48512016 kB' 'MemUsed: 12170020 kB' 'SwapCached: 0 kB' 'Active: 4933140 kB' 'Inactive: 3328428 kB' 'Active(anon): 4857120 kB' 'Inactive(anon): 0 kB' 'Active(file): 76020 kB' 'Inactive(file): 3328428 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7969576 kB' 'Mapped: 67280 kB' 'AnonPages: 292064 kB' 'Shmem: 4565128 kB' 'KernelStack: 11000 kB' 'PageTables: 3416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 119620 kB' 'Slab: 464168 kB' 'SReclaimable: 119620 kB' 'SUnreclaim: 344548 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 
07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.910 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
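At this point the test has already confirmed the system-wide total (1024 pages equals nr_hugepages plus surplus plus reserved) and is repeating for node 1 the same HugePages_Surp lookup it just finished for node 0; each node's observed surplus is added to the 512 pages expected there before the "nodeN=512 expecting 512" comparison lines that follow. A rough, illustrative sketch of that per-node bookkeeping, with variable names of my own choosing rather than the script's:

    # Illustrative only: tally the expected per-node hugepage count plus any surplus
    # the kernel reports, mirroring the accounting the trace walks through. The
    # names and the awk lookup are assumptions, not hugepages.sh itself.
    expected_per_node=512
    declare -A nodes_test
    for node_dir in /sys/devices/system/node/node[0-9]*; do
        [[ -e $node_dir/meminfo ]] || continue
        node=${node_dir##*node}
        # Per-node meminfo lines look like "Node 0 HugePages_Surp: 0".
        surp=$(awk '/HugePages_Surp/ {print $NF}' "$node_dir/meminfo")
        nodes_test[$node]=$(( expected_per_node + surp ))
        echo "node${node}=${nodes_test[$node]} expecting ${expected_per_node}"
    done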
00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:24.911 node0=512 expecting 512 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.911 
07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:24.911 node1=512 expecting 512 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:24.911 00:04:24.911 real 0m4.146s 00:04:24.911 user 0m1.621s 00:04:24.911 sys 0m2.594s 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:24.911 07:40:09 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:24.911 ************************************ 00:04:24.911 END TEST per_node_1G_alloc 00:04:24.911 ************************************ 00:04:24.911 07:40:09 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:24.911 07:40:09 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:24.911 07:40:09 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:24.911 07:40:09 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:24.911 07:40:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:24.911 ************************************ 00:04:24.911 START TEST even_2G_alloc 00:04:24.911 ************************************ 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:24.911 07:40:09 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.911 07:40:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:29.127 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:29.128 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.128 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110523300 kB' 'MemAvailable: 113677668 kB' 'Buffers: 12464 kB' 'Cached: 9336256 kB' 'SwapCached: 0 kB' 'Active: 6443156 kB' 'Inactive: 3458012 kB' 'Active(anon): 6048544 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555652 kB' 'Mapped: 182668 kB' 'Shmem: 5496096 kB' 'KReclaimable: 252176 kB' 'Slab: 869612 kB' 'SReclaimable: 252176 kB' 'SUnreclaim: 617436 kB' 'KernelStack: 24880 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7574348 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226736 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
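The even_2G_alloc test that begins in this stretch of the log asks for 1024 two-megabyte pages (2097152 kB) split evenly across the two nodes, which is what NRHUGE=1024 and HUGE_EVEN_ALLOC=yes request from scripts/setup.sh before verify_nr_hugepages re-reads /proc/meminfo. Purely as an illustration of the intended effect, not of what the SPDK setup script literally does, an even split can be requested through sysfs like this (root required):

    # Illustration of an even 2 MiB hugepage split across NUMA nodes via sysfs;
    # this is not scripts/setup.sh, just the effect the test expects to see.
    NRHUGE=${NRHUGE:-1024}
    nodes=(/sys/devices/system/node/node[0-9]*)
    per_node=$(( NRHUGE / ${#nodes[@]} ))
    for n in "${nodes[@]}"; do
        echo "$per_node" > "$n/hugepages/hugepages-2048kB/nr_hugepages"
    done
    grep -E 'HugePages_(Total|Free)' /proc/meminfo   # expect HugePages_Total: 1024 system-wide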
00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.128 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110528388 kB' 'MemAvailable: 113682740 kB' 'Buffers: 12464 kB' 'Cached: 9336260 kB' 'SwapCached: 0 kB' 'Active: 6442768 kB' 'Inactive: 3458012 kB' 'Active(anon): 6048156 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555300 kB' 'Mapped: 182612 kB' 'Shmem: 5496100 kB' 'KReclaimable: 252144 kB' 'Slab: 869604 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617460 kB' 'KernelStack: 24880 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7574364 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226720 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.129 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.130 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110528428 kB' 'MemAvailable: 113682780 kB' 'Buffers: 12464 kB' 'Cached: 9336276 kB' 'SwapCached: 0 kB' 'Active: 6442744 kB' 'Inactive: 3458012 kB' 'Active(anon): 6048132 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555300 kB' 'Mapped: 182612 kB' 'Shmem: 5496116 kB' 'KReclaimable: 252144 kB' 'Slab: 869604 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617460 kB' 'KernelStack: 24880 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7574388 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226720 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.131 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.132 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:29.133 nr_hugepages=1024 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:29.133 resv_hugepages=0 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:29.133 surplus_hugepages=0 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:29.133 anon_hugepages=0 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110529696 kB' 'MemAvailable: 113684048 kB' 'Buffers: 12464 kB' 'Cached: 9336296 kB' 'SwapCached: 0 kB' 'Active: 6442788 kB' 'Inactive: 3458012 kB' 'Active(anon): 6048176 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555344 kB' 'Mapped: 182672 kB' 'Shmem: 5496136 kB' 'KReclaimable: 252144 kB' 'Slab: 869604 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617460 kB' 'KernelStack: 24880 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7574408 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226720 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.133 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.134 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.135 07:40:13 
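The long [[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue ladder above is the xtrace of a field lookup over /proc/meminfo: every line is split on ': ', non-matching keys are skipped, and the value of the requested key is echoed (1024 here), after which get_nodes records 512 pages per node and no_nodes=2. A condensed sketch of that lookup, under the assumption that per-node files prefix each key with "Node N " exactly as the prefix-stripping in the trace implies:

    get_meminfo_sketch() {
        # illustrative reimplementation of the lookup, not the script's own code
        local get=$1 node=${2:-} mem_f=/proc/meminfo line var val
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        while read -r line; do
            line=${line#"Node $node "}          # per-node files prepend "Node N " to every key
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "$mem_f"
        return 1
    }
    get_meminfo_sketch HugePages_Total      # prints 1024 on this host
    get_meminfo_sketch HugePages_Surp 0     # per-node lookup, prints 0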
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62010364 kB' 'MemUsed: 3651636 kB' 'SwapCached: 0 kB' 'Active: 1510352 kB' 'Inactive: 129584 kB' 'Active(anon): 1191760 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379160 kB' 'Mapped: 115360 kB' 'AnonPages: 263972 kB' 'Shmem: 930984 kB' 'KernelStack: 13896 kB' 'PageTables: 4760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132524 kB' 'Slab: 405340 kB' 'SReclaimable: 132524 kB' 'SUnreclaim: 272816 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.135 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 
07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:29.136 07:40:13 
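Node 0 passes: once a node index is supplied, mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo (visible at common.sh@22-24 above), HugePages_Surp comes back as 0, and the loop advances to node 1. The per-node adjustment suggested by hugepages.sh@115-117, roughly, and reusing the helper sketched earlier:

    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                                          # reserved pages (0 here)
        (( nodes_test[node] += $(get_meminfo_sketch HugePages_Surp "$node") ))  # node surplus (0 here)
    done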
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 48519884 kB' 'MemUsed: 12162152 kB' 'SwapCached: 0 kB' 'Active: 4932416 kB' 'Inactive: 3328428 kB' 'Active(anon): 4856396 kB' 'Inactive(anon): 0 kB' 'Active(file): 76020 kB' 'Inactive(file): 3328428 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7969624 kB' 'Mapped: 67248 kB' 'AnonPages: 291336 kB' 'Shmem: 4565176 kB' 'KernelStack: 10984 kB' 'PageTables: 3420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 119620 kB' 'Slab: 464264 kB' 'SReclaimable: 119620 kB' 'SUnreclaim: 344644 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.136 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.136 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 
07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:29.137 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:29.138 node0=512 expecting 512 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:29.138 07:40:13 
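Both nodes report zero surplus, and the trace then fills sorted_t and sorted_s using the per-node counts themselves as array indices (sorted_t[nodes_test[node]]=1) before echoing node0=512 expecting 512. Indexing by value like this turns the array into a set of distinct counts, so an even allocation should leave exactly one entry; that reading is an inference from the trace rather than something this excerpt confirms. The idiom in isolation:

    # use an array as a set: one entry per distinct per-node count
    declare -A distinct=()
    for node in "${!nodes_test[@]}"; do
        distinct[${nodes_test[node]}]=1
    done
    (( ${#distinct[@]} == 1 )) && echo 'every node holds the same number of hugepages'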
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:29.138 node1=512 expecting 512 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:29.138 00:04:29.138 real 0m4.152s 00:04:29.138 user 0m1.639s 00:04:29.138 sys 0m2.584s 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:29.138 07:40:13 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:29.138 ************************************ 00:04:29.138 END TEST even_2G_alloc 00:04:29.138 ************************************ 00:04:29.138 07:40:13 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:29.138 07:40:13 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:29.138 07:40:13 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:29.138 07:40:13 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:29.138 07:40:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:29.138 ************************************ 00:04:29.138 START TEST odd_alloc 00:04:29.138 ************************************ 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- 
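even_2G_alloc passes (both nodes report 512 pages, matching the expected 512) in about 4.2 seconds, and odd_alloc begins. Its requested size of 2098176 kB (2049 MiB) is not a whole number of 2048 kB pages; the trace shows it rounded up to nr_hugepages=1025 and split deliberately unevenly across the two nodes, 512 on one and 513 on the other, per the nodes_test assignments. The rounding and split as arithmetic, assuming ceiling division (consistent with the observed values):

    size_kb=2098176; hp_kb=2048; no_nodes=2
    nr_hugepages=$(( (size_kb + hp_kb - 1) / hp_kb ))              # 1025  (2098176 / 2048 = 1024.5, rounded up)
    echo $(( nr_hugepages / no_nodes ))                            # 512 pages on one node
    echo $(( nr_hugepages / no_nodes + nr_hugepages % no_nodes ))  # 513 pages on the other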
setup/hugepages.sh@83 -- # : 0 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:29.138 07:40:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:33.353 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:33.353 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110518588 kB' 'MemAvailable: 113672940 kB' 'Buffers: 12464 kB' 'Cached: 9336444 kB' 'SwapCached: 0 kB' 'Active: 6445076 kB' 'Inactive: 3458012 kB' 'Active(anon): 6050464 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 557552 kB' 'Mapped: 182656 kB' 'Shmem: 5496284 kB' 'KReclaimable: 252144 kB' 'Slab: 869540 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617396 kB' 'KernelStack: 24912 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7575468 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226800 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.353 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.354 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
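The long runs of "-- # [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] ... continue" above are setup/common.sh's get_meminfo walking every field of /proc/meminfo until it reaches the key it was asked for (AnonHugePages here, HugePages_Surp and HugePages_Rsvd in the scans that follow). A minimal sketch of that lookup pattern, assuming only stock bash and a readable /proc/meminfo; get_meminfo_value is an illustrative name, not the SPDK helper:

#!/usr/bin/env bash
# Minimal sketch of the lookup traced above: split each /proc/meminfo line
# on ': ' and stop at the first field whose name matches the requested key.
# get_meminfo_value is an illustrative name, not the setup/common.sh helper.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped; every "-- # continue"
        # iteration in the trace corresponds to one of these skips.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done </proc/meminfo
    return 1
}

get_meminfo_value AnonHugePages     # prints 0 (value in kB) on the box traced above
get_meminfo_value HugePages_Total   # prints 1025 during this odd_alloc run

The real helper also tests /sys/devices/system/node/node<N>/meminfo first, which appears to be how the same scan is reused for per-node lookups; with the node argument left empty it falls back to /proc/meminfo, matching the [[ -e /sys/devices/system/node/node/meminfo ]] checks in the trace.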
00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110521068 kB' 'MemAvailable: 113675420 kB' 'Buffers: 12464 kB' 'Cached: 9336448 kB' 'SwapCached: 0 kB' 'Active: 6443724 kB' 'Inactive: 3458012 kB' 'Active(anon): 6049112 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 556172 kB' 'Mapped: 182632 kB' 'Shmem: 5496288 kB' 'KReclaimable: 252144 kB' 'Slab: 869592 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617448 kB' 'KernelStack: 24880 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7575484 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226768 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
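While the HugePages_Surp scan continues below, it is worth noting how the odd page count came about: earlier in this odd_alloc trace, get_test_nr_hugepages_per_node spread the 1025 requested hugepages over the two NUMA nodes as nodes_test[1]=512 and nodes_test[0]=513. A short bash sketch of that distribution logic as suggested by those assignments; split_hugepages_per_node is an illustrative name, not the hugepages.sh code:

#!/usr/bin/env bash
# Sketch of the per-node split suggested by the nodes_test assignments in the
# odd_alloc prologue: each node takes an even share of what is left, so the
# odd remainder lands on the last node processed (node0 here).
# split_hugepages_per_node is an illustrative helper, not hugepages.sh itself.
split_hugepages_per_node() {
    local remaining=$1 nodes=$2
    local -a nodes_test
    while (( nodes > 0 )); do
        nodes_test[nodes - 1]=$(( remaining / nodes ))
        remaining=$(( remaining - nodes_test[nodes - 1] ))
        nodes=$(( nodes - 1 ))
    done
    local i
    for i in "${!nodes_test[@]}"; do
        echo "node$i=${nodes_test[i]} expecting ${nodes_test[i]}"
    done
}

split_hugepages_per_node 1025 2   # prints node0=513 ... and node1=512 ...

Dividing the remaining pages by the remaining node count on each pass keeps the split as even as possible and parks the extra page on whichever node is assigned last, which is consistent with the 513/512 split this test later verifies.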
00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.355 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.356 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110520312 kB' 'MemAvailable: 113674664 kB' 'Buffers: 
12464 kB' 'Cached: 9336448 kB' 'SwapCached: 0 kB' 'Active: 6443760 kB' 'Inactive: 3458012 kB' 'Active(anon): 6049148 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 556204 kB' 'Mapped: 182632 kB' 'Shmem: 5496288 kB' 'KReclaimable: 252144 kB' 'Slab: 869592 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617448 kB' 'KernelStack: 24896 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7575504 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226768 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.357 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.358 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.359 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:33.359 nr_hugepages=1025 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:33.359 resv_hugepages=0 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:33.359 surplus_hugepages=0 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:33.359 anon_hugepages=0 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110521304 kB' 'MemAvailable: 113675656 kB' 'Buffers: 12464 kB' 'Cached: 9336480 kB' 'SwapCached: 0 kB' 'Active: 6443452 kB' 'Inactive: 3458012 kB' 'Active(anon): 6048840 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555816 kB' 'Mapped: 182632 kB' 'Shmem: 5496320 kB' 'KReclaimable: 252144 kB' 'Slab: 869592 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617448 kB' 'KernelStack: 24880 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7575528 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226768 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- 
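
The trace above is setup/common.sh's get_meminfo helper scanning the memory counters key by key: it splits each line on IFS=': ' with read -r var val _, skips every key that is not the one requested (hence the long run of "continue"), and echoes the value when it hits a match, here HugePages_Rsvd -> 0 (so resv=0), before starting a second pass for HugePages_Total over the /proc/meminfo snapshot just printed. A minimal standalone sketch of that lookup idea follows; meminfo_value is an illustrative name, not the suite's actual helper, and it only covers the global /proc/meminfo case (the real helper also reads the per-node files under /sys/devices/system/node/nodeN/meminfo after stripping the leading "Node N " prefix, as visible in the trace).

    #!/usr/bin/env bash
    # Sketch of a get_meminfo-style lookup: scan /proc/meminfo, split each
    # "Key:   value [kB]" line, and print the value for the requested key.
    meminfo_value() {
        local key=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$key" ]] && { echo "${val:-0}"; return 0; }
        done < /proc/meminfo
        return 1
    }

    meminfo_value HugePages_Rsvd    # -> 0 on the machine traced above
    meminfo_value HugePages_Total   # -> 1025 on the machine traced above
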
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.359 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node0/meminfo ]] 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62015328 kB' 'MemUsed: 3646672 kB' 'SwapCached: 0 kB' 'Active: 1511500 kB' 'Inactive: 129584 kB' 'Active(anon): 1192908 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379256 kB' 'Mapped: 115384 kB' 'AnonPages: 265012 kB' 'Shmem: 931080 kB' 'KernelStack: 13912 kB' 'PageTables: 4800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132524 kB' 'Slab: 405416 kB' 'SReclaimable: 132524 kB' 'SUnreclaim: 272892 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:33.360 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
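
At this point the trace has confirmed HugePages_Total is 1025 globally, set the expected odd split across the two NUMA nodes (nodes_sys[0]=512, nodes_sys[1]=513, no_nodes=2), and is now scanning node0's meminfo, whose snapshot above reports HugePages_Total: 512, HugePages_Free: 512, HugePages_Surp: 0. The check being driven here can be restated as: the per-node totals must add up to nr_hugepages=1025 and no node may report surplus pages. The loop below is an illustrative awk-based stand-in for that accounting, not the suite's get_meminfo/get_nodes machinery; the sysfs paths are the same ones appearing in the trace.

    #!/usr/bin/env bash
    # Illustrative restatement of the odd_alloc per-node check: sum
    # HugePages_Total across the NUMA nodes and require HugePages_Surp == 0.
    total=0
    for node in /sys/devices/system/node/node[0-9]*; do
        pages=$(awk '/HugePages_Total/ {print $NF}' "$node/meminfo")
        surp=$(awk '/HugePages_Surp/  {print $NF}' "$node/meminfo")
        echo "${node##*/}: HugePages_Total=$pages HugePages_Surp=$surp"
        (( total += pages ))
        (( surp == 0 )) || echo "unexpected surplus on ${node##*/}"
    done
    (( total == 1025 )) && echo "per-node totals add up to nr_hugepages=1025"
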
00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.361 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 
kB' 'MemFree: 48508856 kB' 'MemUsed: 12173180 kB' 'SwapCached: 0 kB' 'Active: 4932336 kB' 'Inactive: 3328428 kB' 'Active(anon): 4856316 kB' 'Inactive(anon): 0 kB' 'Active(file): 76020 kB' 'Inactive(file): 3328428 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7969732 kB' 'Mapped: 67248 kB' 'AnonPages: 291144 kB' 'Shmem: 4565284 kB' 'KernelStack: 10968 kB' 'PageTables: 3376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 119620 kB' 'Slab: 464160 kB' 'SReclaimable: 119620 kB' 'SUnreclaim: 344540 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:33.362 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ <xtrace loop: each remaining per-node meminfo key, Inactive(anon) through Unaccepted, is tested against HugePages_Surp and skipped with 'continue'> ]] 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.363 07:40:17
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:33.363 node0=512 expecting 513 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:33.363 node1=513 expecting 512 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:33.363 00:04:33.363 real 0m4.129s 00:04:33.363 user 0m1.607s 00:04:33.363 sys 0m2.596s 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:33.363 07:40:17 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:33.363 ************************************ 00:04:33.363 END TEST odd_alloc 00:04:33.363 ************************************ 00:04:33.363 07:40:17 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:33.363 07:40:17 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:33.363 07:40:17 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:33.363 07:40:17 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:33.363 07:40:17 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:33.363 ************************************ 00:04:33.363 START TEST custom_alloc 00:04:33.363 ************************************ 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:33.363 07:40:18 
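The odd_alloc block above ends with the interesting part: the trace reports 'node0=512 expecting 513' and 'node1=513 expecting 512' yet still passes, because the check at hugepages.sh@130 ([[ 512 513 == \5\1\2\ \5\1\3 ]]) only compares the per-node counts as a whole set; with an odd total it does not matter which NUMA node receives the extra page. The trace does this with the sorted_t/sorted_s associative arrays; the lines below are only a hedged sketch of the same idea using an explicit sort, and the variable names are illustrative rather than the SPDK scripts' own.
  # Sketch (bash): treat per-node hugepage counts as a multiset, so
  # {512,513} matches {513,512} regardless of which node got the odd page.
  nodes_sys=(512 513)    # illustrative: counts the kernel actually allocated per node
  nodes_want=(513 512)   # illustrative: counts the test asked for per node
  got=$(printf '%s\n' "${nodes_sys[@]}" | sort -n | tr '\n' ' ')
  want=$(printf '%s\n' "${nodes_want[@]}" | sort -n | tr '\n' ' ')
  [[ $got == "$want" ]] && echo "per-node allocation OK: $got"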
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # 
nr_hugepages=1024 00:04:33.363 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@78 -- # return 0 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.364 07:40:18 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:37.574 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:37.574 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@28 -- # mapfile -t mem 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109481784 kB' 'MemAvailable: 112636136 kB' 'Buffers: 12464 kB' 'Cached: 9336612 kB' 'SwapCached: 0 kB' 'Active: 6445504 kB' 'Inactive: 3458012 kB' 'Active(anon): 6050892 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 557700 kB' 'Mapped: 182676 kB' 'Shmem: 5496452 kB' 'KReclaimable: 252144 kB' 'Slab: 869692 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617548 kB' 'KernelStack: 24864 kB' 'PageTables: 8260 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7577532 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226640 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
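Everything from the mapfile above down to the 'echo 0 / return 0' pair further below is setup/common.sh's get_meminfo helper running under xtrace: /proc/meminfo (or a per-node copy under /sys/devices/system/node/node*/meminfo, with its 'Node N ' prefix stripped) is read field-wise with IFS=': ', and each key is tested against the one requested, with 'continue' skipping all the others. The 'always [madvise] never' test at hugepages.sh@96 just before it appears to read the transparent-hugepage mode string, so the anon check only matters when THP is not forced to 'never'. A minimal stand-alone sketch of the lookup follows; the helper name my_get_meminfo is an assumption for illustration, not the script's API.
  # Sketch (bash): print the value of one /proc/meminfo key, defaulting to 0.
  my_get_meminfo() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          # lines look like "HugePages_Surp:    0" or "MemTotal: 126344036 kB"
          if [[ $var == "$get" ]]; then
              echo "$val"
              return 0
          fi
      done < /proc/meminfo
      echo 0
  }
  my_get_meminfo AnonHugePages    # -> 0 on this builder, per the snapshot above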
00:04:37.574 07:40:21 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ <xtrace loop: each remaining /proc/meminfo key, SwapCached through Percpu, is tested against AnonHugePages and skipped with 'continue'> ]] 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc
-- setup/common.sh@32 -- # continue 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109485348 kB' 'MemAvailable: 112639700 kB' 'Buffers: 12464 kB' 'Cached: 9336612 kB' 'SwapCached: 0 kB' 'Active: 6444872 kB' 'Inactive: 3458012 kB' 'Active(anon): 6050260 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 557084 kB' 'Mapped: 182656 kB' 'Shmem: 5496452 kB' 'KReclaimable: 252144 kB' 'Slab: 869712 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617568 kB' 'KernelStack: 24832 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7577548 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226608 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.575 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.575 07:40:22 
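Stepping back to the custom_alloc preamble traced before the device list: get_test_nr_hugepages converted the two requested sizes into default 2048 kB pages (1048576 kB -> 512 pages for node 0, 2097152 kB -> 1024 pages for node 1), and the result was folded into HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' for scripts/setup.sh, which gives the 1536-page total visible in the meminfo snapshots here. A hedged sketch of that arithmetic and of the string assembly; to_pages and the surrounding variable names are illustrative, not the scripts' actual helpers.
  # Sketch (bash): size two per-node hugepage pools and build a HUGENODE-style spec.
  hugepagesize_kb=2048                               # matches 'Hugepagesize: 2048 kB' above
  to_pages() { echo $(( $1 / hugepagesize_kb )); }   # requested kB -> number of pages
  nodes_hp[0]=$(to_pages 1048576)                    # 512 pages on node 0
  nodes_hp[1]=$(to_pages 2097152)                    # 1024 pages on node 1
  spec=()
  for node in "${!nodes_hp[@]}"; do
      spec+=("nodes_hp[$node]=${nodes_hp[node]}")
  done
  HUGENODE=$(IFS=,; echo "${spec[*]}")
  echo "$HUGENODE"                                   # nodes_hp[0]=512,nodes_hp[1]=1024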
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.576 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ <xtrace loop: each remaining /proc/meminfo key, MemFree through FileHugePages, is tested against HugePages_Surp and skipped with 'continue'> ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:37.577 07:40:22 
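verify_nr_hugepages is simply collecting counters one at a time: AnonHugePages gave anon=0 and HugePages_Surp gave surp=0 above, and the trace now starts the same scan for HugePages_Rsvd before the pool is compared against the 1536 pages that were requested. The pass/fail arithmetic itself is not visible in this excerpt, so the sketch below only shows the gathering step, reusing the illustrative my_get_meminfo helper sketched earlier.
  # Sketch (bash): pull the counters the verifier reads back to back in the trace.
  anon=$(my_get_meminfo AnonHugePages)     # transparent hugepages in use (0 kB here)
  surp=$(my_get_meminfo HugePages_Surp)    # surplus pages beyond the configured pool (0)
  resv=$(my_get_meminfo HugePages_Rsvd)    # pages reserved but not yet faulted in
  total=$(my_get_meminfo HugePages_Total)  # 1536 on this builder
  echo "anon=$anon surp=$surp resv=$resv total=$total (expected 1536)"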
setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109485560 kB' 'MemAvailable: 112639912 kB' 'Buffers: 12464 kB' 'Cached: 9336616 kB' 'SwapCached: 0 kB' 'Active: 6444544 kB' 'Inactive: 3458012 kB' 'Active(anon): 6049932 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 556752 kB' 'Mapped: 182656 kB' 'Shmem: 5496456 kB' 'KReclaimable: 252144 kB' 'Slab: 869712 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617568 kB' 'KernelStack: 24816 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7577572 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226624 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.577 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.578 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:37.579 nr_hugepages=1536 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:37.579 resv_hugepages=0 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:37.579 surplus_hugepages=0 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:37.579 anon_hugepages=0 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
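The trace above shows setup/common.sh resolving single meminfo fields (HugePages_Surp, then HugePages_Rsvd) by reading the file line by line and comparing each field name against the requested key, yielding surp=0 and resv=0 before the HugePages_Total lookup that follows. A minimal sketch of that lookup pattern in plain bash, assuming only what the trace shows — the helper name lookup_meminfo and its exact structure are illustrative, not the repository's actual get_meminfo implementation:

    #!/usr/bin/env bash
    # Illustrative sketch (not SPDK's setup/common.sh): print the value of one
    # meminfo field, preferring the per-NUMA-node file when a node is given.
    lookup_meminfo() {
        local key=$1 node=${2:-}
        local mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local line var val _
        while read -r line; do
            # Per-node files prefix every field with "Node <n> "; strip it first.
            [[ $line =~ ^Node\ [0-9]+\ (.*)$ ]] && line=${BASH_REMATCH[1]}
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$key" ]]; then
                printf '%s\n' "$val"   # numeric value only; any "kB" unit lands in $_
                return 0
            fi
        done <"$mem_f"
        return 1
    }
    # e.g. lookup_meminfo HugePages_Total  -> 1536, and
    #      lookup_meminfo HugePages_Surp 0 -> 0, matching the values echoed in this log.

The repeated per-key [[ ... == ... ]] / continue lines throughout this section are the xtrace of that scan visiting every meminfo field until the requested one is reached.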
00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109483796 kB' 'MemAvailable: 112638148 kB' 'Buffers: 12464 kB' 'Cached: 9336616 kB' 'SwapCached: 0 kB' 'Active: 6446476 kB' 'Inactive: 3458012 kB' 'Active(anon): 6051864 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558648 kB' 'Mapped: 182656 kB' 'Shmem: 5496456 kB' 'KReclaimable: 252144 kB' 'Slab: 869712 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617568 kB' 'KernelStack: 24800 kB' 'PageTables: 8056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7597112 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226624 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.579 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.580 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62022816 kB' 'MemUsed: 3639184 kB' 'SwapCached: 0 kB' 'Active: 1510152 kB' 'Inactive: 129584 kB' 'Active(anon): 1191560 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379276 kB' 'Mapped: 115404 kB' 'AnonPages: 263548 kB' 'Shmem: 931100 kB' 'KernelStack: 13864 kB' 'PageTables: 4660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 
132524 kB' 'Slab: 405296 kB' 'SReclaimable: 132524 kB' 'SUnreclaim: 272772 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:37.581 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:37.581 
[xtrace condensed: setup/common.sh@31-32 keeps reading the current node's meminfo with IFS=': '; read -r var val _; the fields Active(file) through Unaccepted, HugePages_Total and HugePages_Free are all skipped because none of them match HugePages_Surp]
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:37.582 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 47461228 kB' 'MemUsed: 13220808 kB' 'SwapCached: 0 kB' 'Active: 4934860 kB' 'Inactive: 3328428 kB' 'Active(anon): 4858840 kB' 'Inactive(anon): 0 kB' 'Active(file): 76020 kB' 'Inactive(file): 3328428 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7969884 kB' 'Mapped: 67252 kB' 'AnonPages: 293584 kB' 'Shmem: 4565436 kB' 'KernelStack: 10920 kB' 'PageTables: 3284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 119620 kB' 'Slab: 464412 kB' 'SReclaimable: 119620 kB' 'SUnreclaim: 344792 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: setup/common.sh@31-32 scans node1's meminfo; the fields MemTotal through HugePages_Free are skipped because none of them match HugePages_Surp]
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:37.583 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:37.583 node0=512 expecting 512
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:37.584 node1=1024 expecting 1024
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:37.584 
00:04:37.584 real 0m4.121s
00:04:37.584 user 0m1.573s
00:04:37.584 sys 0m2.619s
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:37.584 07:40:22 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:37.584 ************************************
00:04:37.584 END TEST custom_alloc
00:04:37.584 ************************************
00:04:37.584 07:40:22 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
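The custom_alloc trace above is dominated by setup/common.sh's get_meminfo helper, which walks a meminfo file field by field until it finds the requested key. A minimal sketch of that pattern, reconstructed from the xtrace alone (the real helper in the SPDK tree may differ in details such as quoting and error handling):

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern that strips "Node <N> " prefixes

    # get_meminfo FIELD [NODE]
    # Prints FIELD from /proc/meminfo, or from the per-node file
    # /sys/devices/system/node/node<NODE>/meminfo when NODE is given.
    get_meminfo() {
        local get=$1 node=${2:-} var val _ line
        local mem_f=/proc/meminfo mem
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # per-node files prefix every line with "Node <N> "; strip that prefix
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # skip until e.g. HugePages_Surp
            echo "${val:-0}"
            return 0
        done
        echo 0
    }

    get_meminfo HugePages_Surp 1   # as in the trace: node1 has no surplus pages, so this prints 0

This scan is why every lookup in the log shows up as a long run of '[[ <field> == ... ]]' / 'continue' lines: the helper reads the whole file top to bottom for each query.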
00:04:37.584 07:40:22 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:37.584 07:40:22 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:37.584 07:40:22 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:37.584 07:40:22 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:37.584 ************************************
00:04:37.584 START TEST no_shrink_alloc
00:04:37.584 ************************************
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:37.584 07:40:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:41.816 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:41.816 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver
00:04:41.816 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver
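Before re-checking the counters, no_shrink_alloc asks get_test_nr_hugepages for 2097152 kB pinned to node 0; with the 2048 kB Hugepagesize reported later in the trace, that works out to the nr_hugepages=1024 seen above, all assigned to node 0. A rough sketch of that arithmetic (illustrative only; variable names mirror the trace):

    default_hugepages=2048                         # kB, Hugepagesize from /proc/meminfo
    size=2097152                                   # kB requested by the test (2 GiB)
    user_nodes=(0)                                 # the test pins the allocation to node 0
    nr_hugepages=$(( size / default_hugepages ))   # 2097152 / 2048 = 1024 pages
    nodes_test=()
    for node in "${user_nodes[@]}"; do
        nodes_test[node]=$nr_hugepages             # expected per-node page count
    done
    echo "node0=${nodes_test[0]} pages of ${default_hugepages} kB"

scripts/setup.sh is then re-run to apply the new hugepage reservation; the PCI functions it lists were already claimed by vfio-pci in an earlier pass, which is why each one reports "Already using the vfio-pci driver".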
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110527284 kB' 'MemAvailable: 113681636 kB' 'Buffers: 12464 kB' 'Cached: 9336792 kB' 'SwapCached: 0 kB' 'Active: 6446692 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052080 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558192 kB' 'Mapped: 182804 kB' 'Shmem: 5496632 kB' 'KReclaimable: 252144 kB' 'Slab: 869856 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617712 kB' 'KernelStack: 24976 kB' 'PageTables: 8324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7580100 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226864 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB'
[xtrace condensed: setup/common.sh@31-32 scans /proc/meminfo; the fields MemTotal through HardwareCorrupted are skipped because none of them match AnonHugePages]
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:41.817 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
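With transparent hugepages not set to [never] (the check above), verify_nr_hugepages records AnonHugePages (0 kB here) and next reads HugePages_Surp and HugePages_Rsvd so it can fold surplus and reserved pages into the expected per-node totals, the same bookkeeping that produced the 'node0=512 expecting 512' lines at the end of custom_alloc. A simplified, illustrative reconstruction of that accounting, building on the get_meminfo sketch above (variable names follow the trace, but the exact control flow of setup/hugepages.sh may differ):

    nodes_test=([0]=512 [1]=1024)        # example per-node expectations, as in custom_alloc
    nodes_sys=()
    anon=$(get_meminfo AnonHugePages)    # THP-backed pages; only counted when THP is not [never]
    surp=$(get_meminfo HugePages_Surp)   # system-wide surplus hugepages
    resv=$(get_meminfo HugePages_Rsvd)   # reserved but not yet faulted-in hugepages
    # anon/surp/resv also feed the system-wide total check, which is not shown here
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                                   # adjust the expected count
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        nodes_sys[node]=$(get_meminfo HugePages_Total "$node")           # what the kernel reports
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done

The test passes when the sorted set of kernel-reported per-node counts matches the sorted set of expected counts, which is the comparison custom_alloc performed above as [[ 512,1024 == 512,1024 ]].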
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:41.818 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110527776 kB' 'MemAvailable: 113682128 kB' 'Buffers: 12464 kB' 'Cached: 9336796 kB' 'SwapCached: 0 kB' 'Active: 6446768 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052156 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558188 kB' 'Mapped: 182768 kB' 'Shmem: 5496636 kB' 'KReclaimable: 252144 kB' 'Slab: 869844 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617700 kB' 'KernelStack: 24928 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7578500 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226704 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB'
[xtrace condensed: setup/common.sh@31-32 scans /proc/meminfo for HugePages_Surp; the fields MemTotal through PageTables are skipped because none of them match]
07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.819 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110529492 kB' 'MemAvailable: 113683844 kB' 'Buffers: 12464 kB' 'Cached: 9336816 kB' 'SwapCached: 0 kB' 'Active: 6446836 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052224 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558824 kB' 'Mapped: 182692 kB' 'Shmem: 5496656 kB' 'KReclaimable: 252144 kB' 'Slab: 869860 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617716 kB' 'KernelStack: 24992 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7580140 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226816 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.820 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
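What the @31/@32 records above are tracing: get_meminfo in setup/common.sh splits each /proc/meminfo line on ': ' into a field name and a value, skips every field that does not match the requested name, and echoes the matching value (HugePages_Surp was just reported as 0, and HugePages_Rsvd is being looked up the same way). A minimal stand-alone sketch of the same idea, using a hypothetical helper name (meminfo_value) and ignoring the per-node file handling the real script performs:

meminfo_value() {
    # Print the value of one /proc/meminfo field, e.g. HugePages_Surp.
    # Sketch only; not the exact setup/common.sh code.
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip non-matching fields, as traced above
        echo "$val"                        # numeric value; any trailing "kB" unit lands in $_
        return 0
    done < /proc/meminfo
    return 1
}

Called as "meminfo_value HugePages_Surp", this would print 0 on the node used in this run.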
[... repetitive trace condensed: setup/common.sh@31/@32 reads each remaining /proc/meminfo field from Mlocked through CmaFree and skips it with continue, since none matches \H\u\g\e\P\a\g\e\s\_\R\s\v\d ...]
00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:41.822 nr_hugepages=1024 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:41.822 resv_hugepages=0 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:41.822 surplus_hugepages=0 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:41.822 anon_hugepages=0 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@28 -- # mapfile -t mem 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110530272 kB' 'MemAvailable: 113684624 kB' 'Buffers: 12464 kB' 'Cached: 9336836 kB' 'SwapCached: 0 kB' 'Active: 6447148 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052536 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 559032 kB' 'Mapped: 182692 kB' 'Shmem: 5496676 kB' 'KReclaimable: 252144 kB' 'Slab: 869860 kB' 'SReclaimable: 252144 kB' 'SUnreclaim: 617716 kB' 'KernelStack: 24896 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7580160 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226848 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.822 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.822 
07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... repetitive trace condensed: setup/common.sh@31/@32 reads each remaining /proc/meminfo field from SwapCached through CmaTotal and skips it with continue, since none matches \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ...]
00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:41.823 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 
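The xtrace above is the setup/common.sh get_meminfo helper scanning a meminfo file key by key until it hits the requested field. A minimal sketch of that lookup pattern, assuming bash with extglob and the standard /proc and per-node sysfs meminfo layouts (the function and variable names here are illustrative, not the SPDK source):

    shopt -s extglob

    get_meminfo_sketch() {
        # get_meminfo_sketch <field> [node]: print the value of <field> from
        # /proc/meminfo, or from the per-node meminfo file when a node is given.
        local get=$1 node=$2 mem_f=/proc/meminfo
        local -a mem
        local var val _ line
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that first.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # skip all other fields, as in the trace
            echo "$val"
            return 0
        done
        return 1
    }

    # Example calls matching the lookups in this log:
    #   get_meminfo_sketch HugePages_Total     -> 1024
    #   get_meminfo_sketch HugePages_Surp 0    -> 0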
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60984984 kB' 'MemUsed: 4677016 kB' 'SwapCached: 0 kB' 'Active: 1510776 kB' 'Inactive: 129584 kB' 'Active(anon): 1192184 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379372 kB' 'Mapped: 115420 kB' 'AnonPages: 264108 kB' 'Shmem: 931196 kB' 'KernelStack: 13880 kB' 'PageTables: 4584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132524 kB' 'Slab: 405264 kB' 'SReclaimable: 132524 kB' 'SUnreclaim: 272740 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.824 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:41.825 node0=1024 expecting 1024 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:41.825 07:40:26 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:46.040 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:46.040 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:46.040 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
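The INFO line above comes from scripts/setup.sh noticing that node0 already holds more hugepages than the 512 requested via NRHUGE. A hypothetical version of that decision, using the standard sysfs path for 2 MB pages (the variable names and exact message format are assumptions, not the setup.sh source):

    NRHUGE=${NRHUGE:-512}
    node_hp=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    allocated=$(<"$node_hp")
    if (( allocated >= NRHUGE )); then
        echo "INFO: Requested $NRHUGE hugepages but $allocated already allocated on node0"
    else
        echo "$NRHUGE" > "$node_hp"   # needs root; shown for illustration only
    fi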
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110539728 kB' 'MemAvailable: 113694084 kB' 'Buffers: 12464 kB' 'Cached: 9336948 kB' 'SwapCached: 0 kB' 'Active: 6447260 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052648 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558752 kB' 'Mapped: 182828 kB' 'Shmem: 5496788 kB' 'KReclaimable: 252152 kB' 'Slab: 869400 kB' 'SReclaimable: 252152 kB' 'SUnreclaim: 617248 kB' 'KernelStack: 24896 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7578388 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226736 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 
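As a quick sanity check on the /proc/meminfo snapshot printed above, the hugepage fields are self-consistent: HugePages_Total times Hugepagesize equals the Hugetlb figure.

    # 1024 pages x 2048 kB per page = 2097152 kB, matching 'Hugetlb: 2097152 kB' above.
    echo "$(( 1024 * 2048 )) kB"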
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.040 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 
07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.041 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110540420 kB' 'MemAvailable: 113694776 kB' 'Buffers: 12464 kB' 'Cached: 9336952 kB' 'SwapCached: 0 kB' 'Active: 6447456 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052844 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558984 kB' 'Mapped: 182828 kB' 'Shmem: 5496792 kB' 'KReclaimable: 252152 kB' 'Slab: 869392 kB' 'SReclaimable: 252152 kB' 'SUnreclaim: 617240 kB' 'KernelStack: 24880 kB' 'PageTables: 8188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7578408 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226704 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 
07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.042 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.043 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110541580 kB' 'MemAvailable: 113695936 kB' 'Buffers: 12464 kB' 'Cached: 9336968 kB' 'SwapCached: 0 kB' 'Active: 6446664 kB' 'Inactive: 3458012 kB' 'Active(anon): 6052052 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558632 kB' 'Mapped: 183196 kB' 'Shmem: 5496808 kB' 'KReclaimable: 252152 kB' 'Slab: 869396 kB' 'SReclaimable: 252152 kB' 'SUnreclaim: 617244 kB' 'KernelStack: 24880 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7579652 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226704 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 
07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.044 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:46.045 nr_hugepages=1024 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:46.045 resv_hugepages=0 
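[Editor's note] For readers following the trace: the long runs of "[[ <field> == HugePages_Surp ]] / continue" and "[[ <field> == HugePages_Rsvd ]] / continue" lines above are a single xtrace'd loop inside the get_meminfo helper in setup/common.sh. The helper reads /proc/meminfo (or a per-node meminfo file when a node index is given), walks the key/value pairs, and echoes the value of the requested field. The sketch below is reconstructed from the trace alone, as a minimal illustration; the exact function layout, the extglob prefix strip, and the example calls are assumptions, not the verbatim SPDK source.

#!/usr/bin/env bash
# Minimal sketch of the get_meminfo helper whose execution is traced above.
# Reconstructed from the xtrace output of setup/common.sh; not the verbatim
# SPDK implementation.
shopt -s extglob   # needed for the "Node +([0-9]) " prefix pattern below

get_meminfo() {
    local get=$1          # field to look up, e.g. HugePages_Rsvd, HugePages_Total
    local node=${2:-}     # optional NUMA node index; empty means system-wide
    local var val line
    local mem_f mem

    # Default to the system-wide view; switch to the per-node file when it exists.
    mem_f=/proc/meminfo
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    # Read the whole file, then strip the "Node N " prefix that the per-node
    # meminfo carries, so both formats parse identically.
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")

    # Scan key/value pairs until the requested field is found, then echo its value.
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Example calls matching the values echoed in this run:
get_meminfo HugePages_Total      # -> 1024 on this runner
get_meminfo HugePages_Rsvd       # -> 0
get_meminfo HugePages_Surp 0     # per-node read from node0/meminfo

In this run the helper returns surp=0 and resv=0 with nr_hugepages=1024, which is why the "1024 == nr_hugepages + surp + resv" check traced just below passes. [End editor's note]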
00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:46.045 surplus_hugepages=0 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:46.045 anon_hugepages=0 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:46.045 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110533964 kB' 'MemAvailable: 113688320 kB' 'Buffers: 12464 kB' 'Cached: 9336992 kB' 'SwapCached: 0 kB' 'Active: 6451104 kB' 'Inactive: 3458012 kB' 'Active(anon): 6056492 kB' 'Inactive(anon): 0 kB' 'Active(file): 394612 kB' 'Inactive(file): 3458012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563032 kB' 'Mapped: 183196 kB' 'Shmem: 5496832 kB' 'KReclaimable: 252152 kB' 'Slab: 869396 kB' 'SReclaimable: 252152 kB' 'SUnreclaim: 617244 kB' 'KernelStack: 24880 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7584572 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226688 kB' 'VmallocChunk: 0 kB' 'Percpu: 97280 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 721188 kB' 'DirectMap2M: 16785408 kB' 'DirectMap1G: 118489088 kB' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.046 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 
0 )) 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60978472 kB' 'MemUsed: 4683528 kB' 'SwapCached: 0 kB' 'Active: 1512780 kB' 'Inactive: 129584 kB' 'Active(anon): 1194188 kB' 'Inactive(anon): 0 kB' 'Active(file): 318592 kB' 'Inactive(file): 129584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1379504 kB' 'Mapped: 115588 kB' 'AnonPages: 266172 kB' 'Shmem: 931328 kB' 'KernelStack: 13912 kB' 'PageTables: 4848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 132524 kB' 'Slab: 405028 kB' 'SReclaimable: 132524 kB' 'SUnreclaim: 272504 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.047 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.048 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:46.049 node0=1024 expecting 1024 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:46.049 00:04:46.049 real 0m8.224s 00:04:46.049 user 0m3.140s 00:04:46.049 sys 0m5.221s 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:46.049 07:40:30 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:46.049 ************************************ 00:04:46.049 END TEST no_shrink_alloc 00:04:46.049 ************************************ 00:04:46.049 07:40:30 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:46.049 07:40:30 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:46.049 00:04:46.049 real 0m31.547s 00:04:46.049 user 0m11.454s 00:04:46.049 sys 0m18.700s 00:04:46.049 07:40:30 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:46.049 07:40:30 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:46.049 ************************************ 00:04:46.049 END TEST hugepages 00:04:46.049 ************************************ 00:04:46.049 07:40:30 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:46.049 07:40:30 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 
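The hugepages suite above ends with get_meminfo walking /sys/devices/system/node/node0/meminfo field by field and clear_hp echoing 0 into each node's hugepage pools before exporting CLEAR_HUGE=yes. A minimal bash sketch of those two steps follows, re-derived from the trace rather than copied from setup/common.sh or setup/hugepages.sh; the helper names and the nr_hugepages redirect target (not visible in the xtrace) are assumptions.

# Minimal sketch, not the SPDK helpers themselves: read one per-node counter the
# way the traced get_meminfo loop does, then zero every node's hugepage pools
# as the traced clear_hp loop does.
get_node_meminfo() {                 # e.g. get_node_meminfo HugePages_Surp 0
  local key=$1 node=$2 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$key" ]] && { echo "$val"; return 0; }
  done < <(sed 's/^Node [0-9]* *//' "/sys/devices/system/node/node$node/meminfo")
  return 1
}

clear_node_hugepages() {
  local node hp
  for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
      echo 0 > "$hp/nr_hugepages"    # assumed target; the xtrace only shows 'echo 0'
    done
  done
}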
00:04:46.049 07:40:30 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:46.049 07:40:30 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:46.049 07:40:30 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:46.049 ************************************ 00:04:46.049 START TEST driver 00:04:46.049 ************************************ 00:04:46.049 07:40:30 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:46.049 * Looking for test storage... 00:04:46.049 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:46.049 07:40:30 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:46.049 07:40:30 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:46.049 07:40:30 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:51.374 07:40:35 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:51.374 07:40:35 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:51.374 07:40:35 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:51.374 07:40:35 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:51.374 ************************************ 00:04:51.374 START TEST guess_driver 00:04:51.374 ************************************ 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 370 > 0 )) 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:51.374 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:51.374 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:51.374 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:51.374 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:51.374 insmod 
/lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:51.374 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:51.374 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:51.374 Looking for driver=vfio-pci 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.374 07:40:35 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.595 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:55.596 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:55.596 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:55.596 07:40:39 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:56.980 07:40:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:56.980 07:40:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:56.980 07:40:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:57.240 07:40:41 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:57.240 07:40:41 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:57.240 07:40:41 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:57.240 07:40:41 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 
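Everything the guess_driver test relies on is visible in the trace above: vfio-pci wins when /sys/kernel/iommu_groups is populated (370 groups here) and modprobe --show-depends vfio_pci resolves to real .ko modules; otherwise the script would report 'No valid driver found'. A hedged sketch of that decision follows; the uio_pci_generic fallback is an assumption, not something this excerpt exercises.

# Sketch of the selection logic traced above, not setup/driver.sh itself.
guess_pci_driver() {
  shopt -s nullglob
  local groups=(/sys/kernel/iommu_groups/*)
  if (( ${#groups[@]} > 0 )) && modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
    echo vfio-pci
  elif modprobe --show-depends uio_pci_generic &>/dev/null; then
    echo uio_pci_generic             # assumed fallback; not exercised in this log
  else
    echo 'No valid driver found'
    return 1
  fi
}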
00:05:02.525 00:05:02.525 real 0m10.997s 00:05:02.525 user 0m2.970s 00:05:02.525 sys 0m5.514s 00:05:02.525 07:40:46 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:02.525 07:40:46 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:02.525 ************************************ 00:05:02.525 END TEST guess_driver 00:05:02.525 ************************************ 00:05:02.525 07:40:47 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:05:02.525 00:05:02.525 real 0m16.436s 00:05:02.525 user 0m4.621s 00:05:02.525 sys 0m8.479s 00:05:02.525 07:40:47 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:02.525 07:40:47 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:02.525 ************************************ 00:05:02.525 END TEST driver 00:05:02.525 ************************************ 00:05:02.525 07:40:47 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:02.525 07:40:47 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:02.525 07:40:47 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:02.526 07:40:47 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:02.526 07:40:47 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:02.526 ************************************ 00:05:02.526 START TEST devices 00:05:02.526 ************************************ 00:05:02.526 07:40:47 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:02.526 * Looking for test storage... 00:05:02.526 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:02.526 07:40:47 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:02.526 07:40:47 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:02.526 07:40:47 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:02.526 07:40:47 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:07.813 07:40:51 
setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:65:00.0 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:07.813 07:40:51 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:07.813 No valid GPT data, bailing 00:05:07.813 07:40:51 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:07.813 07:40:51 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:07.813 07:40:51 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:65:00.0 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:07.813 07:40:51 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:07.813 07:40:51 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:07.813 ************************************ 00:05:07.813 START TEST nvme_mount 00:05:07.813 ************************************ 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 
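Before nvme_mount starts, the trace shows the devices suite filtering candidates: the namespace must not be zoned, spdk-gpt.py and blkid must find no partition table ('No valid GPT data, bailing', empty PTTYPE), and the reported 2000398934016 bytes must clear min_disk_size=3221225472 (3 GiB). Below is a sketch of the same eligibility check, assuming the conventional 512-byte sectors behind /sys/block/<dev>/size; the helper name is ours, not the script's.

# Sketch of the device-eligibility check, re-derived from the trace above.
min_disk_size=$((3 * 1024 * 1024 * 1024))     # 3221225472 B, as echoed in the log

eligible_test_disk() {                        # e.g. eligible_test_disk nvme0n1
  local dev=$1 zoned bytes
  zoned=$(cat "/sys/block/$dev/queue/zoned" 2>/dev/null || echo none)
  [[ $zoned == none ]] || return 1                             # skip zoned namespaces
  [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] || return 1  # must carry no partition table
  bytes=$(( $(cat "/sys/block/$dev/size") * 512 ))              # size file counts 512 B sectors
  (( bytes >= min_disk_size ))
}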
00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:07.813 07:40:51 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:08.074 Creating new GPT entries in memory. 00:05:08.074 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:08.074 other utilities. 00:05:08.074 07:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:08.074 07:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:08.074 07:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:08.074 07:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:08.074 07:40:52 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:09.015 Creating new GPT entries in memory. 00:05:09.015 The operation has completed successfully. 
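The partition_drive call traced above converts the 1073741824-byte request into 512-byte sectors, zaps the disk, and creates partition 1 over sectors 2048-2099199 while holding flock on the whole-disk node; sync_dev_uevents.sh then waits for the partition uevent. A sketch of the same sequence follows; udevadm settle stands in for the SPDK uevent helper.

# Sketch of the partitioning step just traced (device name taken from the log).
disk=/dev/nvme0n1
size_sectors=$(( 1073741824 / 512 ))      # 1 GiB in 512 B sectors = 2097152
start=2048
end=$(( start + size_sectors - 1 ))       # 2099199, matching the traced sgdisk call

sgdisk "$disk" --zap-all                  # destroy any existing GPT/MBR structures
flock "$disk" sgdisk "$disk" --new=1:"$start":"$end"
udevadm settle                            # stand-in for scripts/sync_dev_uevents.sh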
00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1518266 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:65:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:09.016 07:40:53 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 
07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:13.222 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:13.222 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:13.222 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:05:13.222 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:13.222 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:13.222 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:13.483 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:13.483 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:65:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.483 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:13.483 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:13.483 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 
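cleanup_nvme above unmounts the test mount, wipes the partition and then the whole namespace with wipefs --all, after which the test rebuilds an ext4 filesystem directly on the bare disk, capped at 1024M, and remounts it. A sketch of that teardown-and-reformat, using the mount point and device from the trace; it is an illustration, not setup/devices.sh itself.

# Sketch of the cleanup + whole-disk reformat seen above.
mnt=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
disk=/dev/nvme0n1

mountpoint -q "$mnt" && umount "$mnt"
[[ -b ${disk}p1 ]] && wipefs --all "${disk}p1"
[[ -b $disk ]] && wipefs --all "$disk"

mkdir -p "$mnt"
mkfs.ext4 -qF "$disk" 1024M               # the size argument caps the fs at 1 GiB
mount "$disk" "$mnt"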
00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:13.484 07:40:57 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.695 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:65:00.0 data@nvme0n1 '' '' 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
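The verify pass traced here re-runs setup.sh config with PCI_ALLOWED pinned to 0000:65:00.0 and scans each 'pci _ _ status' line until the allowed address reports the expected active device (data@nvme0n1 in this round). A sketch of that loop follows; the function name is ours and the column layout is inferred from the read call in the trace.

# Sketch of the verify loop, not setup/devices.sh itself.
SETUP=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh

verify_usage() {                    # e.g. verify_usage 0000:65:00.0 data@nvme0n1
  local dev=$1 expected=$2 pci _ status found=0
  while read -r pci _ _ status; do
    [[ $pci == "$dev" && $status == *"Active devices: "*"$expected"* ]] && found=1
  done < <(PCI_ALLOWED=$dev "$SETUP" config)
  (( found == 1 ))
}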
00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.696 07:41:01 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:21.012 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:21.273 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:21.273 00:05:21.273 real 0m14.286s 00:05:21.273 user 0m4.346s 00:05:21.273 sys 0m7.816s 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:21.273 07:41:05 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:21.273 ************************************ 00:05:21.273 END TEST nvme_mount 00:05:21.273 ************************************ 00:05:21.273 07:41:05 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:21.273 07:41:05 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:21.273 07:41:05 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:21.273 07:41:05 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:21.273 07:41:05 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:21.273 ************************************ 00:05:21.273 START TEST dm_mount 00:05:21.273 ************************************ 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:21.273 07:41:05 
setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:21.273 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:21.274 07:41:05 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:22.658 Creating new GPT entries in memory. 00:05:22.658 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:22.658 other utilities. 00:05:22.658 07:41:06 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:22.658 07:41:06 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:22.658 07:41:06 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:22.658 07:41:06 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:22.658 07:41:06 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:23.637 Creating new GPT entries in memory. 00:05:23.637 The operation has completed successfully. 00:05:23.637 07:41:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:23.637 07:41:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:23.637 07:41:08 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:23.637 07:41:08 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:23.637 07:41:08 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:24.583 The operation has completed successfully. 
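The partition_drive trace above zaps the GPT on the test disk and then creates two 2097152-sector (1 GiB) partitions, waiting for udev to publish nvme0n1p1 and nvme0n1p2 before continuing. A rough standalone equivalent of those commands is sketched below; it is destructive, /dev/nvme0n1 is only the device used in this run, and udevadm settle stands in for the sync_dev_uevents.sh helper the test uses:

    # WARNING: destructive; device name taken from this run only.
    disk=/dev/nvme0n1
    sgdisk "$disk" --zap-all                  # wipe existing GPT/MBR structures
    sgdisk "$disk" --new=1:2048:2099199       # partition 1: 2097152 sectors = 1 GiB
    sgdisk "$disk" --new=2:2099200:4196351    # partition 2: the next 1 GiB
    udevadm settle                            # wait for ${disk}p1 / ${disk}p2 to appear
    lsblk "$disk"

The sector numbers follow from common.sh: size=1073741824 bytes / 512 = 2097152 sectors, so partition 1 spans 2048..2099199 and partition 2 picks up at 2099200.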
00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1523461 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:65:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
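With the two partitions in place, dm_mount builds a device-mapper target named nvme_dm_test over them, resolves it to /dev/dm-0, confirms both partitions list dm-0 as a holder, formats it ext4, and mounts it. The table fed to dmsetup create is not visible in this excerpt, so the sketch below assumes a simple linear concatenation of the two partitions; everything else mirrors the traced commands:

    # Assumption: a linear table over nvme0n1p1 + nvme0n1p2 (the real table is not shown in this log).
    p1=/dev/nvme0n1p1; p2=/dev/nvme0n1p2
    s1=$(blockdev --getsz "$p1"); s2=$(blockdev --getsz "$p2")   # sizes in 512-byte sectors
    printf '%s\n' "0 $s1 linear $p1 0" "$s1 $s2 linear $p2 0" | dmsetup create nvme_dm_test
    readlink -f /dev/mapper/nvme_dm_test                          # e.g. /dev/dm-0
    ls /sys/class/block/nvme0n1p1/holders/                        # should list dm-0
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mkdir -p /tmp/dm_mount && mount /dev/mapper/nvme_dm_test /tmp/dm_mount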
00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:24.583 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:24.584 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:24.584 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.584 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:24.584 07:41:09 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:24.584 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:24.584 07:41:09 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:28.787 07:41:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:65:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == 
output ]] 00:05:28.787 07:41:13 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 
07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:32.186 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:32.447 07:41:16 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:32.447 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:32.447 07:41:17 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:32.447 07:41:17 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:32.447 00:05:32.447 real 0m11.039s 00:05:32.447 user 0m2.887s 00:05:32.447 sys 0m5.212s 00:05:32.447 07:41:17 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.447 07:41:17 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:32.447 ************************************ 00:05:32.447 END TEST dm_mount 00:05:32.447 ************************************ 00:05:32.447 07:41:17 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:32.447 07:41:17 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:32.708 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:32.708 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:05:32.708 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:32.708 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:32.708 07:41:17 
setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:32.708 07:41:17 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:32.708 07:41:17 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:32.708 07:41:17 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:32.708 07:41:17 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:32.708 07:41:17 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:32.708 07:41:17 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:32.709 00:05:32.709 real 0m30.252s 00:05:32.709 user 0m8.955s 00:05:32.709 sys 0m16.117s 00:05:32.709 07:41:17 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.709 07:41:17 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:32.709 ************************************ 00:05:32.709 END TEST devices 00:05:32.709 ************************************ 00:05:32.709 07:41:17 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:32.709 00:05:32.709 real 1m46.851s 00:05:32.709 user 0m34.560s 00:05:32.709 sys 1m0.120s 00:05:32.709 07:41:17 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.709 07:41:17 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:32.709 ************************************ 00:05:32.709 END TEST setup.sh 00:05:32.709 ************************************ 00:05:32.709 07:41:17 -- common/autotest_common.sh@1142 -- # return 0 00:05:32.709 07:41:17 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:36.916 Hugepages 00:05:36.916 node hugesize free / total 00:05:36.916 node0 1048576kB 0 / 0 00:05:36.916 node0 2048kB 1024 / 1024 00:05:36.916 node1 1048576kB 0 / 0 00:05:36.916 node1 2048kB 1024 / 1024 00:05:36.916 00:05:36.916 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:36.916 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:05:36.916 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:05:36.916 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:05:36.916 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:05:36.916 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:05:36.916 07:41:21 -- spdk/autotest.sh@130 -- # uname -s 00:05:36.916 07:41:21 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:36.916 07:41:21 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:36.916 07:41:21 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:41.119 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 
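After the device tests finish, autotest.sh calls setup.sh status, which prints the hugepage pools and the PCI device table shown above. The per-node hugepage counters it reports come from sysfs and can be read directly; a small sketch using only the standard kernel paths:

    # Mirror the "Hugepages / node hugesize free / total" table via sysfs.
    for node in /sys/devices/system/node/node*; do
        for hp in "$node"/hugepages/hugepages-*; do
            size=${hp##*hugepages-}
            printf '%s %s free/total: %s / %s\n' \
                "${node##*/}" "$size" \
                "$(cat "$hp/free_hugepages")" "$(cat "$hp/nr_hugepages")"
        done
    done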
00:05:41.119 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:41.119 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:42.503 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:42.763 07:41:27 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:43.706 07:41:28 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:43.706 07:41:28 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:43.706 07:41:28 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:43.706 07:41:28 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:43.706 07:41:28 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:43.706 07:41:28 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:43.706 07:41:28 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:43.706 07:41:28 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:43.706 07:41:28 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:43.706 07:41:28 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:43.706 07:41:28 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:05:43.706 07:41:28 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:47.911 Waiting for block devices as requested 00:05:47.911 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:05:47.911 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:05:47.911 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:05:47.911 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:05:47.911 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:05:47.911 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:05:47.911 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:05:48.173 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:05:48.173 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:05:48.173 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:05:48.432 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:05:48.432 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:05:48.432 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:05:48.697 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:05:48.697 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:05:48.697 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:05:48.697 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:05:48.961 07:41:33 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:48.961 07:41:33 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:65:00.0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1502 -- # grep 0000:65:00.0/nvme/nvme 00:05:48.961 07:41:33 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1503 -- # [[ -z 
/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 ]] 00:05:48.961 07:41:33 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:48.961 07:41:33 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:48.961 07:41:33 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:48.961 07:41:33 -- common/autotest_common.sh@1545 -- # oacs=' 0xe' 00:05:48.961 07:41:33 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:48.961 07:41:33 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:48.961 07:41:33 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:48.961 07:41:33 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:48.961 07:41:33 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:48.961 07:41:33 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:48.961 07:41:33 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:48.961 07:41:33 -- common/autotest_common.sh@1557 -- # continue 00:05:48.961 07:41:33 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:48.961 07:41:33 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:48.961 07:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:48.961 07:41:33 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:48.961 07:41:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:48.961 07:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:48.961 07:41:33 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:53.176 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:53.176 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:55.088 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:55.088 07:41:39 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:55.088 07:41:39 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:55.088 07:41:39 -- common/autotest_common.sh@10 -- # set +x 00:05:55.088 07:41:39 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:55.088 07:41:39 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:05:55.088 07:41:39 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:05:55.088 07:41:39 -- common/autotest_common.sh@1577 -- # bdfs=() 00:05:55.088 07:41:39 -- common/autotest_common.sh@1577 -- # local bdfs 00:05:55.088 07:41:39 -- common/autotest_common.sh@1579 -- # 
get_nvme_bdfs 00:05:55.088 07:41:39 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:55.088 07:41:39 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:55.088 07:41:39 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:55.088 07:41:39 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:55.088 07:41:39 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:55.088 07:41:39 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:55.088 07:41:39 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:05:55.088 07:41:39 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:05:55.088 07:41:39 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:65:00.0/device 00:05:55.088 07:41:39 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:05:55.088 07:41:39 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:55.088 07:41:39 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:05:55.088 07:41:39 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:65:00.0 00:05:55.088 07:41:39 -- common/autotest_common.sh@1592 -- # [[ -z 0000:65:00.0 ]] 00:05:55.088 07:41:39 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=1534675 00:05:55.088 07:41:39 -- common/autotest_common.sh@1598 -- # waitforlisten 1534675 00:05:55.088 07:41:39 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:55.088 07:41:39 -- common/autotest_common.sh@829 -- # '[' -z 1534675 ']' 00:05:55.088 07:41:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:55.088 07:41:39 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:55.088 07:41:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:55.088 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:55.088 07:41:39 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:55.088 07:41:39 -- common/autotest_common.sh@10 -- # set +x 00:05:55.088 [2024-07-15 07:41:39.695985] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
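The opal_revert_cleanup prologue above collects NVMe BDFs from scripts/gen_nvme.sh and then keeps only controllers whose PCI device ID is 0x0a54 by reading sysfs. A condensed sketch of that filtering, reusing the repository path from this run (adjust for another host):

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path as used in this run
    bdfs=()
    for bdf in $("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'); do
        # Keep controllers whose PCI device ID matches the one checked here (0x0a54).
        [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && bdfs+=("$bdf")
    done
    printf '%s\n' "${bdfs[@]}"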
00:05:55.088 [2024-07-15 07:41:39.696042] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1534675 ] 00:05:55.088 [2024-07-15 07:41:39.781577] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.349 [2024-07-15 07:41:39.874306] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.920 07:41:40 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:55.921 07:41:40 -- common/autotest_common.sh@862 -- # return 0 00:05:55.921 07:41:40 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:05:55.921 07:41:40 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:05:55.921 07:41:40 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0 00:05:59.222 nvme0n1 00:05:59.222 07:41:43 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:59.222 [2024-07-15 07:41:43.789066] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:59.222 request: 00:05:59.222 { 00:05:59.222 "nvme_ctrlr_name": "nvme0", 00:05:59.222 "password": "test", 00:05:59.222 "method": "bdev_nvme_opal_revert", 00:05:59.222 "req_id": 1 00:05:59.222 } 00:05:59.222 Got JSON-RPC error response 00:05:59.222 response: 00:05:59.222 { 00:05:59.222 "code": -32602, 00:05:59.222 "message": "Invalid parameters" 00:05:59.222 } 00:05:59.222 07:41:43 -- common/autotest_common.sh@1604 -- # true 00:05:59.222 07:41:43 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:05:59.222 07:41:43 -- common/autotest_common.sh@1608 -- # killprocess 1534675 00:05:59.222 07:41:43 -- common/autotest_common.sh@948 -- # '[' -z 1534675 ']' 00:05:59.222 07:41:43 -- common/autotest_common.sh@952 -- # kill -0 1534675 00:05:59.222 07:41:43 -- common/autotest_common.sh@953 -- # uname 00:05:59.222 07:41:43 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:05:59.222 07:41:43 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1534675 00:05:59.222 07:41:43 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:05:59.222 07:41:43 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:05:59.222 07:41:43 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1534675' 00:05:59.222 killing process with pid 1534675 00:05:59.222 07:41:43 -- common/autotest_common.sh@967 -- # kill 1534675 00:05:59.222 07:41:43 -- common/autotest_common.sh@972 -- # wait 1534675 00:06:01.768 07:41:46 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:01.768 07:41:46 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:01.768 07:41:46 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:01.768 07:41:46 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:01.768 07:41:46 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:02.336 Restarting all devices. 
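The spdk_tgt interaction above boils down to two rpc.py calls: attach the controller at 0000:65:00.0 as bdev nvme0, then request an Opal revert, which this controller rejects with JSON-RPC error -32602 because it reports no Opal support. Reproduced as explicit commands with the same paths and arguments that appear in the log:

    rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$rootdir/scripts/rpc.py" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0
    # Expected to fail on this drive: "nvme0 not support opal" -> JSON-RPC error -32602.
    "$rootdir/scripts/rpc.py" bdev_nvme_opal_revert -b nvme0 -p test \
        || echo "opal revert rejected (no Opal support)"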
00:06:05.651 lstat() error: No such file or directory 00:06:05.651 QAT Error: No GENERAL section found 00:06:05.651 Failed to configure qat_dev0 00:06:05.651 lstat() error: No such file or directory 00:06:05.651 QAT Error: No GENERAL section found 00:06:05.651 Failed to configure qat_dev1 00:06:05.651 lstat() error: No such file or directory 00:06:05.651 QAT Error: No GENERAL section found 00:06:05.651 Failed to configure qat_dev2 00:06:05.651 enable sriov 00:06:05.651 Checking status of all devices. 00:06:05.651 There is 3 QAT acceleration device(s) in the system: 00:06:05.651 qat_dev0 - type: c6xx, inst_id: 0, node_id: 1, bsf: 0000:cc:00.0, #accel: 5 #engines: 10 state: down 00:06:05.652 qat_dev1 - type: c6xx, inst_id: 1, node_id: 1, bsf: 0000:ce:00.0, #accel: 5 #engines: 10 state: down 00:06:05.652 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:d0:00.0, #accel: 5 #engines: 10 state: down 00:06:06.223 0000:cc:00.0 set to 16 VFs 00:06:06.795 0000:ce:00.0 set to 16 VFs 00:06:07.366 0000:d0:00.0 set to 16 VFs 00:06:07.366 Properly configured the qat device with driver uio_pci_generic. 00:06:07.366 07:41:52 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:07.366 07:41:52 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:07.366 07:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:07.366 07:41:52 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:07.366 07:41:52 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:07.366 07:41:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:07.366 07:41:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.366 07:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:07.628 ************************************ 00:06:07.628 START TEST env 00:06:07.628 ************************************ 00:06:07.628 07:41:52 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:07.628 * Looking for test storage... 
00:06:07.628 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:07.628 07:41:52 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:07.628 07:41:52 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:07.628 07:41:52 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.628 07:41:52 env -- common/autotest_common.sh@10 -- # set +x 00:06:07.628 ************************************ 00:06:07.628 START TEST env_memory 00:06:07.628 ************************************ 00:06:07.628 07:41:52 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:07.628 00:06:07.628 00:06:07.628 CUnit - A unit testing framework for C - Version 2.1-3 00:06:07.628 http://cunit.sourceforge.net/ 00:06:07.628 00:06:07.628 00:06:07.628 Suite: memory 00:06:07.628 Test: alloc and free memory map ...[2024-07-15 07:41:52.342129] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:07.628 passed 00:06:07.628 Test: mem map translation ...[2024-07-15 07:41:52.365775] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:07.628 [2024-07-15 07:41:52.365802] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:07.628 [2024-07-15 07:41:52.365847] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:07.628 [2024-07-15 07:41:52.365854] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:07.890 passed 00:06:07.890 Test: mem map registration ...[2024-07-15 07:41:52.416955] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:07.890 [2024-07-15 07:41:52.416978] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:07.890 passed 00:06:07.890 Test: mem map adjacent registrations ...passed 00:06:07.890 00:06:07.890 Run Summary: Type Total Ran Passed Failed Inactive 00:06:07.890 suites 1 1 n/a 0 0 00:06:07.890 tests 4 4 4 0 0 00:06:07.890 asserts 152 152 152 0 n/a 00:06:07.890 00:06:07.890 Elapsed time = 0.181 seconds 00:06:07.890 00:06:07.890 real 0m0.194s 00:06:07.890 user 0m0.184s 00:06:07.890 sys 0m0.009s 00:06:07.890 07:41:52 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:07.890 07:41:52 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:07.890 ************************************ 00:06:07.890 END TEST env_memory 00:06:07.890 ************************************ 00:06:07.890 07:41:52 env -- common/autotest_common.sh@1142 -- # return 0 00:06:07.890 07:41:52 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:07.890 07:41:52 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:07.890 07:41:52 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:07.890 07:41:52 env -- common/autotest_common.sh@10 -- # set +x 00:06:07.890 ************************************ 00:06:07.890 START TEST env_vtophys 00:06:07.890 ************************************ 00:06:07.890 07:41:52 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:07.890 EAL: lib.eal log level changed from notice to debug 00:06:07.890 EAL: Detected lcore 0 as core 0 on socket 0 00:06:07.890 EAL: Detected lcore 1 as core 1 on socket 0 00:06:07.890 EAL: Detected lcore 2 as core 2 on socket 0 00:06:07.890 EAL: Detected lcore 3 as core 3 on socket 0 00:06:07.890 EAL: Detected lcore 4 as core 4 on socket 0 00:06:07.890 EAL: Detected lcore 5 as core 5 on socket 0 00:06:07.890 EAL: Detected lcore 6 as core 6 on socket 0 00:06:07.890 EAL: Detected lcore 7 as core 7 on socket 0 00:06:07.890 EAL: Detected lcore 8 as core 8 on socket 0 00:06:07.890 EAL: Detected lcore 9 as core 9 on socket 0 00:06:07.890 EAL: Detected lcore 10 as core 10 on socket 0 00:06:07.890 EAL: Detected lcore 11 as core 11 on socket 0 00:06:07.890 EAL: Detected lcore 12 as core 12 on socket 0 00:06:07.890 EAL: Detected lcore 13 as core 13 on socket 0 00:06:07.890 EAL: Detected lcore 14 as core 14 on socket 0 00:06:07.890 EAL: Detected lcore 15 as core 15 on socket 0 00:06:07.890 EAL: Detected lcore 16 as core 16 on socket 0 00:06:07.890 EAL: Detected lcore 17 as core 17 on socket 0 00:06:07.890 EAL: Detected lcore 18 as core 18 on socket 0 00:06:07.890 EAL: Detected lcore 19 as core 19 on socket 0 00:06:07.890 EAL: Detected lcore 20 as core 20 on socket 0 00:06:07.890 EAL: Detected lcore 21 as core 21 on socket 0 00:06:07.890 EAL: Detected lcore 22 as core 22 on socket 0 00:06:07.890 EAL: Detected lcore 23 as core 23 on socket 0 00:06:07.890 EAL: Detected lcore 24 as core 24 on socket 0 00:06:07.890 EAL: Detected lcore 25 as core 25 on socket 0 00:06:07.890 EAL: Detected lcore 26 as core 26 on socket 0 00:06:07.890 EAL: Detected lcore 27 as core 27 on socket 0 00:06:07.890 EAL: Detected lcore 28 as core 28 on socket 0 00:06:07.890 EAL: Detected lcore 29 as core 29 on socket 0 00:06:07.890 EAL: Detected lcore 30 as core 30 on socket 0 00:06:07.890 EAL: Detected lcore 31 as core 31 on socket 0 00:06:07.890 EAL: Detected lcore 32 as core 0 on socket 1 00:06:07.890 EAL: Detected lcore 33 as core 1 on socket 1 00:06:07.890 EAL: Detected lcore 34 as core 2 on socket 1 00:06:07.890 EAL: Detected lcore 35 as core 3 on socket 1 00:06:07.890 EAL: Detected lcore 36 as core 4 on socket 1 00:06:07.890 EAL: Detected lcore 37 as core 5 on socket 1 00:06:07.890 EAL: Detected lcore 38 as core 6 on socket 1 00:06:07.890 EAL: Detected lcore 39 as core 7 on socket 1 00:06:07.890 EAL: Detected lcore 40 as core 8 on socket 1 00:06:07.890 EAL: Detected lcore 41 as core 9 on socket 1 00:06:07.890 EAL: Detected lcore 42 as core 10 on socket 1 00:06:07.890 EAL: Detected lcore 43 as core 11 on socket 1 00:06:07.890 EAL: Detected lcore 44 as core 12 on socket 1 00:06:07.890 EAL: Detected lcore 45 as core 13 on socket 1 00:06:07.890 EAL: Detected lcore 46 as core 14 on socket 1 00:06:07.890 EAL: Detected lcore 47 as core 15 on socket 1 00:06:07.890 EAL: Detected lcore 48 as core 16 on socket 1 00:06:07.890 EAL: Detected lcore 49 as core 17 on socket 1 00:06:07.890 EAL: Detected lcore 50 as core 18 on socket 1 00:06:07.890 EAL: Detected lcore 51 as core 19 on socket 1 00:06:07.890 EAL: Detected lcore 52 as core 
20 on socket 1 00:06:07.890 EAL: Detected lcore 53 as core 21 on socket 1 00:06:07.890 EAL: Detected lcore 54 as core 22 on socket 1 00:06:07.890 EAL: Detected lcore 55 as core 23 on socket 1 00:06:07.890 EAL: Detected lcore 56 as core 24 on socket 1 00:06:07.890 EAL: Detected lcore 57 as core 25 on socket 1 00:06:07.890 EAL: Detected lcore 58 as core 26 on socket 1 00:06:07.890 EAL: Detected lcore 59 as core 27 on socket 1 00:06:07.890 EAL: Detected lcore 60 as core 28 on socket 1 00:06:07.890 EAL: Detected lcore 61 as core 29 on socket 1 00:06:07.890 EAL: Detected lcore 62 as core 30 on socket 1 00:06:07.890 EAL: Detected lcore 63 as core 31 on socket 1 00:06:07.890 EAL: Detected lcore 64 as core 0 on socket 0 00:06:07.890 EAL: Detected lcore 65 as core 1 on socket 0 00:06:07.890 EAL: Detected lcore 66 as core 2 on socket 0 00:06:07.890 EAL: Detected lcore 67 as core 3 on socket 0 00:06:07.890 EAL: Detected lcore 68 as core 4 on socket 0 00:06:07.890 EAL: Detected lcore 69 as core 5 on socket 0 00:06:07.890 EAL: Detected lcore 70 as core 6 on socket 0 00:06:07.890 EAL: Detected lcore 71 as core 7 on socket 0 00:06:07.890 EAL: Detected lcore 72 as core 8 on socket 0 00:06:07.890 EAL: Detected lcore 73 as core 9 on socket 0 00:06:07.890 EAL: Detected lcore 74 as core 10 on socket 0 00:06:07.890 EAL: Detected lcore 75 as core 11 on socket 0 00:06:07.890 EAL: Detected lcore 76 as core 12 on socket 0 00:06:07.890 EAL: Detected lcore 77 as core 13 on socket 0 00:06:07.890 EAL: Detected lcore 78 as core 14 on socket 0 00:06:07.890 EAL: Detected lcore 79 as core 15 on socket 0 00:06:07.890 EAL: Detected lcore 80 as core 16 on socket 0 00:06:07.890 EAL: Detected lcore 81 as core 17 on socket 0 00:06:07.890 EAL: Detected lcore 82 as core 18 on socket 0 00:06:07.890 EAL: Detected lcore 83 as core 19 on socket 0 00:06:07.890 EAL: Detected lcore 84 as core 20 on socket 0 00:06:07.890 EAL: Detected lcore 85 as core 21 on socket 0 00:06:07.890 EAL: Detected lcore 86 as core 22 on socket 0 00:06:07.890 EAL: Detected lcore 87 as core 23 on socket 0 00:06:07.890 EAL: Detected lcore 88 as core 24 on socket 0 00:06:07.890 EAL: Detected lcore 89 as core 25 on socket 0 00:06:07.890 EAL: Detected lcore 90 as core 26 on socket 0 00:06:07.890 EAL: Detected lcore 91 as core 27 on socket 0 00:06:07.890 EAL: Detected lcore 92 as core 28 on socket 0 00:06:07.890 EAL: Detected lcore 93 as core 29 on socket 0 00:06:07.890 EAL: Detected lcore 94 as core 30 on socket 0 00:06:07.890 EAL: Detected lcore 95 as core 31 on socket 0 00:06:07.890 EAL: Detected lcore 96 as core 0 on socket 1 00:06:07.890 EAL: Detected lcore 97 as core 1 on socket 1 00:06:07.890 EAL: Detected lcore 98 as core 2 on socket 1 00:06:07.890 EAL: Detected lcore 99 as core 3 on socket 1 00:06:07.890 EAL: Detected lcore 100 as core 4 on socket 1 00:06:07.890 EAL: Detected lcore 101 as core 5 on socket 1 00:06:07.890 EAL: Detected lcore 102 as core 6 on socket 1 00:06:07.890 EAL: Detected lcore 103 as core 7 on socket 1 00:06:07.890 EAL: Detected lcore 104 as core 8 on socket 1 00:06:07.890 EAL: Detected lcore 105 as core 9 on socket 1 00:06:07.890 EAL: Detected lcore 106 as core 10 on socket 1 00:06:07.890 EAL: Detected lcore 107 as core 11 on socket 1 00:06:07.890 EAL: Detected lcore 108 as core 12 on socket 1 00:06:07.891 EAL: Detected lcore 109 as core 13 on socket 1 00:06:07.891 EAL: Detected lcore 110 as core 14 on socket 1 00:06:07.891 EAL: Detected lcore 111 as core 15 on socket 1 00:06:07.891 EAL: Detected lcore 112 as core 16 on socket 1 
00:06:07.891 EAL: Detected lcore 113 as core 17 on socket 1 00:06:07.891 EAL: Detected lcore 114 as core 18 on socket 1 00:06:07.891 EAL: Detected lcore 115 as core 19 on socket 1 00:06:07.891 EAL: Detected lcore 116 as core 20 on socket 1 00:06:07.891 EAL: Detected lcore 117 as core 21 on socket 1 00:06:07.891 EAL: Detected lcore 118 as core 22 on socket 1 00:06:07.891 EAL: Detected lcore 119 as core 23 on socket 1 00:06:07.891 EAL: Detected lcore 120 as core 24 on socket 1 00:06:07.891 EAL: Detected lcore 121 as core 25 on socket 1 00:06:07.891 EAL: Detected lcore 122 as core 26 on socket 1 00:06:07.891 EAL: Detected lcore 123 as core 27 on socket 1 00:06:07.891 EAL: Detected lcore 124 as core 28 on socket 1 00:06:07.891 EAL: Detected lcore 125 as core 29 on socket 1 00:06:07.891 EAL: Detected lcore 126 as core 30 on socket 1 00:06:07.891 EAL: Detected lcore 127 as core 31 on socket 1 00:06:07.891 EAL: Maximum logical cores by configuration: 128 00:06:07.891 EAL: Detected CPU lcores: 128 00:06:07.891 EAL: Detected NUMA nodes: 2 00:06:07.891 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:07.891 EAL: Detected shared linkage of DPDK 00:06:07.891 EAL: No shared files mode enabled, IPC will be disabled 00:06:07.891 EAL: No shared files mode enabled, IPC is disabled 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.0 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.1 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.2 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.3 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.4 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.5 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.6 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:01.7 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.0 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.1 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.2 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.3 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.4 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.5 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.6 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:cc:02.7 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.0 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.1 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.2 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.3 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.4 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.5 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.6 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:01.7 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.0 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.1 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.2 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.3 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.4 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.5 
wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.6 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:ce:02.7 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.0 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.1 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.2 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.3 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.4 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.5 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.6 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:01.7 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.0 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.1 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.2 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.3 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.4 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.5 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.6 wants IOVA as 'PA' 00:06:07.891 EAL: PCI driver qat for device 0000:d0:02.7 wants IOVA as 'PA' 00:06:07.891 EAL: Bus pci wants IOVA as 'PA' 00:06:07.891 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:07.891 EAL: Bus vdev wants IOVA as 'DC' 00:06:07.891 EAL: Selected IOVA mode 'PA' 00:06:07.891 EAL: Probing VFIO support... 00:06:07.891 EAL: IOMMU type 1 (Type 1) is supported 00:06:07.891 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:07.891 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:07.891 EAL: VFIO support initialized 00:06:07.891 EAL: Ask a virtual area of 0x2e000 bytes 00:06:07.891 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:07.891 EAL: Setting up physically contiguous memory... 
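Before reserving memseg lists, EAL probes VFIO and reports that a type 1 IOMMU is available, while the qat driver's request for physical addresses keeps the selected IOVA mode at 'PA'. A quick host-side check of the same facts, using standard kernel interfaces only (not part of the test itself):

    ls /sys/kernel/iommu_groups | head     # non-empty when an IOMMU is active
    modprobe -n -v vfio-pci                # dry run: confirms the vfio-pci module is available
    dmesg | grep -i -m1 'iommu.*enabled'   # kernel-side confirmation; exact wording varies by platform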
00:06:07.891 EAL: Setting maximum number of open files to 524288 00:06:07.891 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:07.891 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:07.891 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:07.891 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:07.891 EAL: Ask a virtual area of 0x61000 bytes 00:06:07.891 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:07.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:07.891 EAL: Ask a virtual area of 0x400000000 bytes 00:06:07.891 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:06:07.891 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:07.891 EAL: Hugepages will be freed exactly as allocated. 00:06:07.891 EAL: No shared files mode enabled, IPC is disabled 00:06:07.891 EAL: No shared files mode enabled, IPC is disabled 00:06:07.891 EAL: TSC frequency is ~2600000 KHz 00:06:07.891 EAL: Main lcore 0 is ready (tid=7f05cd933b00;cpuset=[0]) 00:06:07.891 EAL: Trying to obtain current memory policy. 00:06:07.891 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:07.891 EAL: Restoring previous memory policy: 0 00:06:07.891 EAL: request: mp_malloc_sync 00:06:07.891 EAL: No shared files mode enabled, IPC is disabled 00:06:07.891 EAL: Heap on socket 0 was expanded by 2MB 00:06:07.891 EAL: PCI device 0000:cc:01.0 on NUMA socket 1 00:06:07.891 EAL: probe driver: 8086:37c9 qat 00:06:07.891 EAL: PCI memory mapped at 0x202001000000 00:06:07.891 EAL: PCI memory mapped at 0x202001001000 00:06:07.891 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:07.891 EAL: Trying to obtain current memory policy. 00:06:07.891 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:07.891 EAL: Restoring previous memory policy: 4 00:06:07.891 EAL: request: mp_malloc_sync 00:06:07.891 EAL: No shared files mode enabled, IPC is disabled 00:06:07.891 EAL: Heap on socket 1 was expanded by 2MB 00:06:07.891 EAL: PCI device 0000:cc:01.1 on NUMA socket 1 00:06:07.891 EAL: probe driver: 8086:37c9 qat 00:06:07.891 EAL: PCI memory mapped at 0x202001002000 00:06:07.891 EAL: PCI memory mapped at 0x202001003000 00:06:07.891 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:07.891 EAL: PCI device 0000:cc:01.2 on NUMA socket 1 00:06:07.891 EAL: probe driver: 8086:37c9 qat 00:06:07.891 EAL: PCI memory mapped at 0x202001004000 00:06:07.891 EAL: PCI memory mapped at 0x202001005000 00:06:07.891 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:07.891 EAL: PCI device 0000:cc:01.3 on NUMA socket 1 00:06:07.891 EAL: probe driver: 8086:37c9 qat 00:06:07.891 EAL: PCI memory mapped at 0x202001006000 00:06:07.891 EAL: PCI memory mapped at 0x202001007000 00:06:07.891 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:07.891 EAL: PCI device 0000:cc:01.4 on NUMA socket 1 00:06:07.891 EAL: probe driver: 8086:37c9 qat 00:06:07.891 EAL: PCI memory mapped at 0x202001008000 00:06:07.891 EAL: PCI memory mapped at 0x202001009000 00:06:07.891 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:07.891 EAL: PCI device 0000:cc:01.5 on NUMA socket 1 00:06:07.891 EAL: probe driver: 8086:37c9 qat 00:06:07.891 EAL: PCI memory mapped at 0x20200100a000 00:06:07.892 EAL: PCI memory mapped at 0x20200100b000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:01.6 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200100c000 00:06:07.892 EAL: PCI memory mapped at 0x20200100d000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:01.7 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200100e000 00:06:07.892 EAL: PCI memory mapped at 0x20200100f000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.0 on NUMA socket 1 00:06:07.892 EAL: 
probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001010000 00:06:07.892 EAL: PCI memory mapped at 0x202001011000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.1 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001012000 00:06:07.892 EAL: PCI memory mapped at 0x202001013000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.2 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001014000 00:06:07.892 EAL: PCI memory mapped at 0x202001015000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.3 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001016000 00:06:07.892 EAL: PCI memory mapped at 0x202001017000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.4 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001018000 00:06:07.892 EAL: PCI memory mapped at 0x202001019000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.5 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200101a000 00:06:07.892 EAL: PCI memory mapped at 0x20200101b000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.6 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200101c000 00:06:07.892 EAL: PCI memory mapped at 0x20200101d000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:07.892 EAL: PCI device 0000:cc:02.7 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200101e000 00:06:07.892 EAL: PCI memory mapped at 0x20200101f000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.0 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001020000 00:06:07.892 EAL: PCI memory mapped at 0x202001021000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.1 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001022000 00:06:07.892 EAL: PCI memory mapped at 0x202001023000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.2 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001024000 00:06:07.892 EAL: PCI memory mapped at 0x202001025000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.3 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001026000 00:06:07.892 EAL: PCI memory mapped at 0x202001027000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.4 on NUMA socket 1 
00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001028000 00:06:07.892 EAL: PCI memory mapped at 0x202001029000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.5 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200102a000 00:06:07.892 EAL: PCI memory mapped at 0x20200102b000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.6 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200102c000 00:06:07.892 EAL: PCI memory mapped at 0x20200102d000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:01.7 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x20200102e000 00:06:07.892 EAL: PCI memory mapped at 0x20200102f000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:02.0 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001030000 00:06:07.892 EAL: PCI memory mapped at 0x202001031000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:02.1 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001032000 00:06:07.892 EAL: PCI memory mapped at 0x202001033000 00:06:07.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:07.892 EAL: PCI device 0000:ce:02.2 on NUMA socket 1 00:06:07.892 EAL: probe driver: 8086:37c9 qat 00:06:07.892 EAL: PCI memory mapped at 0x202001034000 00:06:08.153 EAL: PCI memory mapped at 0x202001035000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:08.153 EAL: PCI device 0000:ce:02.3 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001036000 00:06:08.153 EAL: PCI memory mapped at 0x202001037000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:06:08.153 EAL: PCI device 0000:ce:02.4 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001038000 00:06:08.153 EAL: PCI memory mapped at 0x202001039000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:08.153 EAL: PCI device 0000:ce:02.5 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200103a000 00:06:08.153 EAL: PCI memory mapped at 0x20200103b000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:08.153 EAL: PCI device 0000:ce:02.6 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200103c000 00:06:08.153 EAL: PCI memory mapped at 0x20200103d000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:08.153 EAL: PCI device 0000:ce:02.7 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200103e000 00:06:08.153 EAL: PCI memory mapped at 0x20200103f000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.0 on NUMA 
socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001040000 00:06:08.153 EAL: PCI memory mapped at 0x202001041000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.1 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001042000 00:06:08.153 EAL: PCI memory mapped at 0x202001043000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.2 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001044000 00:06:08.153 EAL: PCI memory mapped at 0x202001045000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.3 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001046000 00:06:08.153 EAL: PCI memory mapped at 0x202001047000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.4 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001048000 00:06:08.153 EAL: PCI memory mapped at 0x202001049000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.5 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200104a000 00:06:08.153 EAL: PCI memory mapped at 0x20200104b000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.6 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200104c000 00:06:08.153 EAL: PCI memory mapped at 0x20200104d000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:01.7 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200104e000 00:06:08.153 EAL: PCI memory mapped at 0x20200104f000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.0 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001050000 00:06:08.153 EAL: PCI memory mapped at 0x202001051000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.1 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001052000 00:06:08.153 EAL: PCI memory mapped at 0x202001053000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.2 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001054000 00:06:08.153 EAL: PCI memory mapped at 0x202001055000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.3 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001056000 00:06:08.153 EAL: PCI memory mapped at 0x202001057000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:08.153 EAL: PCI device 
0000:d0:02.4 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x202001058000 00:06:08.153 EAL: PCI memory mapped at 0x202001059000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.5 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200105a000 00:06:08.153 EAL: PCI memory mapped at 0x20200105b000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.6 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200105c000 00:06:08.153 EAL: PCI memory mapped at 0x20200105d000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:08.153 EAL: PCI device 0000:d0:02.7 on NUMA socket 1 00:06:08.153 EAL: probe driver: 8086:37c9 qat 00:06:08.153 EAL: PCI memory mapped at 0x20200105e000 00:06:08.153 EAL: PCI memory mapped at 0x20200105f000 00:06:08.153 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: No PCI address specified using 'addr=' in: bus=pci 00:06:08.153 EAL: Mem event callback 'spdk:(nil)' registered 00:06:08.153 00:06:08.153 00:06:08.153 CUnit - A unit testing framework for C - Version 2.1-3 00:06:08.153 http://cunit.sourceforge.net/ 00:06:08.153 00:06:08.153 00:06:08.153 Suite: components_suite 00:06:08.153 Test: vtophys_malloc_test ...passed 00:06:08.153 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:08.153 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.153 EAL: Restoring previous memory policy: 4 00:06:08.153 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.153 EAL: request: mp_malloc_sync 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: Heap on socket 0 was expanded by 4MB 00:06:08.153 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.153 EAL: request: mp_malloc_sync 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: Heap on socket 0 was shrunk by 4MB 00:06:08.153 EAL: Trying to obtain current memory policy. 00:06:08.153 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.153 EAL: Restoring previous memory policy: 4 00:06:08.153 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.153 EAL: request: mp_malloc_sync 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: Heap on socket 0 was expanded by 6MB 00:06:08.153 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.153 EAL: request: mp_malloc_sync 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: Heap on socket 0 was shrunk by 6MB 00:06:08.153 EAL: Trying to obtain current memory policy. 
00:06:08.153 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.153 EAL: Restoring previous memory policy: 4 00:06:08.153 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.153 EAL: request: mp_malloc_sync 00:06:08.153 EAL: No shared files mode enabled, IPC is disabled 00:06:08.153 EAL: Heap on socket 0 was expanded by 10MB 00:06:08.153 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.153 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was shrunk by 10MB 00:06:08.154 EAL: Trying to obtain current memory policy. 00:06:08.154 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.154 EAL: Restoring previous memory policy: 4 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was expanded by 18MB 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was shrunk by 18MB 00:06:08.154 EAL: Trying to obtain current memory policy. 00:06:08.154 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.154 EAL: Restoring previous memory policy: 4 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was expanded by 34MB 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was shrunk by 34MB 00:06:08.154 EAL: Trying to obtain current memory policy. 00:06:08.154 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.154 EAL: Restoring previous memory policy: 4 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was expanded by 66MB 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was shrunk by 66MB 00:06:08.154 EAL: Trying to obtain current memory policy. 00:06:08.154 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.154 EAL: Restoring previous memory policy: 4 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was expanded by 130MB 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was shrunk by 130MB 00:06:08.154 EAL: Trying to obtain current memory policy. 
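The alternating "Heap on socket 0 was expanded by .../shrunk by ..." messages above come from the vtophys test allocating and then freeing progressively larger DMA-safe buffers, which grows and trims the per-socket DPDK heap and fires the registered 'spdk' mem event callback each time. Below is a simplified stand-alone sketch of that pattern, not the test's actual source; it assumes the SPDK headers and libraries are installed, and the program name and buffer sizes are illustrative only.

    /* Sketch: allocate, translate and free growing buffers; each allocation
     * may expand the heap and each free may shrink it, producing the
     * "expanded by"/"shrunk by" log lines above. */
    #include <stdio.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;
        size_t size;

        spdk_env_opts_init(&opts);
        opts.name = "vtophys_sketch";   /* arbitrary app name */
        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "spdk_env_init() failed\n");
            return 1;
        }

        for (size = 4UL << 20; size <= (1UL << 30); size *= 2) {
            void *buf = spdk_dma_zmalloc(size, 0x1000, NULL);
            if (buf == NULL) {
                break;  /* not enough hugepage memory left */
            }
            if (spdk_vtophys(buf, NULL) == SPDK_VTOPHYS_ERROR) {
                fprintf(stderr, "no translation for %zu bytes\n", size);
            }
            spdk_dma_free(buf);
        }

        return 0;
    }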
00:06:08.154 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.154 EAL: Restoring previous memory policy: 4 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was expanded by 258MB 00:06:08.154 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.154 EAL: request: mp_malloc_sync 00:06:08.154 EAL: No shared files mode enabled, IPC is disabled 00:06:08.154 EAL: Heap on socket 0 was shrunk by 258MB 00:06:08.154 EAL: Trying to obtain current memory policy. 00:06:08.154 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.414 EAL: Restoring previous memory policy: 4 00:06:08.415 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.415 EAL: request: mp_malloc_sync 00:06:08.415 EAL: No shared files mode enabled, IPC is disabled 00:06:08.415 EAL: Heap on socket 0 was expanded by 514MB 00:06:08.415 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.415 EAL: request: mp_malloc_sync 00:06:08.415 EAL: No shared files mode enabled, IPC is disabled 00:06:08.415 EAL: Heap on socket 0 was shrunk by 514MB 00:06:08.415 EAL: Trying to obtain current memory policy. 00:06:08.415 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:08.674 EAL: Restoring previous memory policy: 4 00:06:08.674 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.674 EAL: request: mp_malloc_sync 00:06:08.674 EAL: No shared files mode enabled, IPC is disabled 00:06:08.674 EAL: Heap on socket 0 was expanded by 1026MB 00:06:08.674 EAL: Calling mem event callback 'spdk:(nil)' 00:06:08.674 EAL: request: mp_malloc_sync 00:06:08.674 EAL: No shared files mode enabled, IPC is disabled 00:06:08.674 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:08.674 passed 00:06:08.674 00:06:08.674 Run Summary: Type Total Ran Passed Failed Inactive 00:06:08.674 suites 1 1 n/a 0 0 00:06:08.674 tests 2 2 2 0 0 00:06:08.674 asserts 6653 6653 6653 0 n/a 00:06:08.674 00:06:08.674 Elapsed time = 0.671 seconds 00:06:08.674 EAL: No shared files mode enabled, IPC is disabled 00:06:08.674 EAL: No shared files mode enabled, IPC is disabled 00:06:08.674 EAL: No shared files mode enabled, IPC is disabled 00:06:08.674 00:06:08.674 real 0m0.821s 00:06:08.674 user 0m0.424s 00:06:08.674 sys 0m0.373s 00:06:08.674 07:41:53 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:08.674 07:41:53 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:08.674 ************************************ 00:06:08.674 END TEST env_vtophys 00:06:08.674 ************************************ 00:06:08.674 07:41:53 env -- common/autotest_common.sh@1142 -- # return 0 00:06:08.674 07:41:53 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:08.674 07:41:53 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:08.674 07:41:53 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.674 07:41:53 env -- common/autotest_common.sh@10 -- # set +x 00:06:08.935 ************************************ 00:06:08.935 START TEST env_pci 00:06:08.935 ************************************ 00:06:08.935 07:41:53 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:06:08.935 00:06:08.935 00:06:08.935 CUnit - A unit testing framework for C - Version 2.1-3 00:06:08.935 http://cunit.sourceforge.net/ 00:06:08.935 00:06:08.935 00:06:08.935 Suite: pci 00:06:08.935 Test: 
pci_hook ...[2024-07-15 07:41:53.491142] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1537298 has claimed it 00:06:08.935 EAL: Cannot find device (10000:00:01.0) 00:06:08.935 EAL: Failed to attach device on primary process 00:06:08.935 passed 00:06:08.935 00:06:08.935 Run Summary: Type Total Ran Passed Failed Inactive 00:06:08.935 suites 1 1 n/a 0 0 00:06:08.935 tests 1 1 1 0 0 00:06:08.935 asserts 25 25 25 0 n/a 00:06:08.935 00:06:08.935 Elapsed time = 0.031 seconds 00:06:08.935 00:06:08.935 real 0m0.057s 00:06:08.935 user 0m0.013s 00:06:08.935 sys 0m0.044s 00:06:08.935 07:41:53 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:08.935 07:41:53 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:08.935 ************************************ 00:06:08.935 END TEST env_pci 00:06:08.935 ************************************ 00:06:08.935 07:41:53 env -- common/autotest_common.sh@1142 -- # return 0 00:06:08.935 07:41:53 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:08.935 07:41:53 env -- env/env.sh@15 -- # uname 00:06:08.935 07:41:53 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:08.935 07:41:53 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:08.935 07:41:53 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:08.935 07:41:53 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:06:08.935 07:41:53 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.935 07:41:53 env -- common/autotest_common.sh@10 -- # set +x 00:06:08.935 ************************************ 00:06:08.935 START TEST env_dpdk_post_init 00:06:08.935 ************************************ 00:06:08.935 07:41:53 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:08.935 EAL: Detected CPU lcores: 128 00:06:08.935 EAL: Detected NUMA nodes: 2 00:06:08.935 EAL: Detected shared linkage of DPDK 00:06:08.935 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:08.935 EAL: Selected IOVA mode 'PA' 00:06:08.935 EAL: VFIO support initialized 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, 
max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: 
Creating cryptodev 0000:cc:02.2_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:06:08.935 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.935 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:08.935 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 
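Each QAT virtual function probed here is exposed as a pair of DPDK cryptodevs (*_qat_sym and *_qat_asym), which is what the "Creating cryptodev ... max queue pairs" lines record. A short sketch, not part of this log and assuming a DPDK build with the QAT PMD, that lists the cryptodevs created during such a probe:

    /* Enumerate the crypto devices registered during EAL/PMD probe. */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_cryptodev.h>

    int main(int argc, char **argv)
    {
        uint8_t id, n;
        struct rte_cryptodev_info info;

        if (rte_eal_init(argc, argv) < 0) {
            fprintf(stderr, "rte_eal_init() failed\n");
            return 1;
        }

        n = rte_cryptodev_count();      /* devices created at probe time */
        for (id = 0; id < n; id++) {
            rte_cryptodev_info_get(id, &info);
            printf("cryptodev %u: driver %s, max queue pairs %u\n",
                   id, info.driver_name, info.max_nb_queue_pairs);
        }

        rte_eal_cleanup();
        return 0;
    }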
00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:08.936 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:08.936 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:06:08.936 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:09.197 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:06:09.197 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.197 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:06:09.197 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.197 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:09.197 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:06:09.197 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:09.197 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:06:09.197 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:09.197 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:09.197 EAL: Using IOMMU type 1 (Type 1) 00:06:09.197 EAL: Ignore mapping IO port bar(1) 00:06:09.458 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.0 (socket 0) 00:06:09.458 EAL: Ignore mapping IO port bar(1) 00:06:09.458 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.1 (socket 0) 00:06:09.720 EAL: Ignore mapping IO port bar(1) 00:06:09.720 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.2 (socket 0) 00:06:09.981 EAL: Ignore mapping IO port bar(1) 00:06:09.981 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.3 (socket 0) 00:06:10.242 EAL: Ignore mapping IO port bar(1) 00:06:10.242 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.4 (socket 0) 00:06:10.503 EAL: Ignore mapping IO port bar(1) 00:06:10.503 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.5 (socket 0) 00:06:10.503 EAL: Ignore mapping IO port bar(1) 00:06:10.792 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.6 (socket 0) 00:06:10.792 EAL: Ignore mapping IO port bar(1) 00:06:11.052 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.7 (socket 0) 00:06:11.624 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:65:00.0 (socket 0) 00:06:11.885 EAL: Ignore mapping IO port bar(1) 00:06:11.885 EAL: Probe PCI driver: 
spdk_ioat (8086:0b00) device: 0000:80:01.0 (socket 1) 00:06:11.885 EAL: Ignore mapping IO port bar(1) 00:06:12.145 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.1 (socket 1) 00:06:12.145 EAL: Ignore mapping IO port bar(1) 00:06:12.406 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.2 (socket 1) 00:06:12.406 EAL: Ignore mapping IO port bar(1) 00:06:12.667 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.3 (socket 1) 00:06:12.667 EAL: Ignore mapping IO port bar(1) 00:06:12.667 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.4 (socket 1) 00:06:12.926 EAL: Ignore mapping IO port bar(1) 00:06:12.926 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.5 (socket 1) 00:06:13.187 EAL: Ignore mapping IO port bar(1) 00:06:13.187 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.6 (socket 1) 00:06:13.448 EAL: Ignore mapping IO port bar(1) 00:06:13.448 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.7 (socket 1) 00:06:17.690 EAL: Releasing PCI mapped resource for 0000:65:00.0 00:06:17.690 EAL: Calling pci_unmap_resource for 0000:65:00.0 at 0x202001080000 00:06:17.690 Starting DPDK initialization... 00:06:17.690 Starting SPDK post initialization... 00:06:17.690 SPDK NVMe probe 00:06:17.690 Attaching to 0000:65:00.0 00:06:17.690 Attached to 0000:65:00.0 00:06:17.690 Cleaning up... 00:06:19.602 00:06:19.602 real 0m10.371s 00:06:19.602 user 0m4.229s 00:06:19.602 sys 0m0.164s 00:06:19.602 07:42:03 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.602 07:42:03 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:19.602 ************************************ 00:06:19.602 END TEST env_dpdk_post_init 00:06:19.602 ************************************ 00:06:19.602 07:42:04 env -- common/autotest_common.sh@1142 -- # return 0 00:06:19.602 07:42:04 env -- env/env.sh@26 -- # uname 00:06:19.602 07:42:04 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:19.602 07:42:04 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:19.602 07:42:04 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.602 07:42:04 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.602 07:42:04 env -- common/autotest_common.sh@10 -- # set +x 00:06:19.602 ************************************ 00:06:19.602 START TEST env_mem_callbacks 00:06:19.602 ************************************ 00:06:19.602 07:42:04 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:19.602 EAL: Detected CPU lcores: 128 00:06:19.602 EAL: Detected NUMA nodes: 2 00:06:19.602 EAL: Detected shared linkage of DPDK 00:06:19.602 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:19.602 EAL: Selected IOVA mode 'PA' 00:06:19.602 EAL: VFIO support initialized 00:06:19.602 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.602 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:19.602 CRYPTODEV: 
Creating cryptodev 0000:cc:01.1_qat_asym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.602 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.602 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:06:19.602 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.602 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:19.602 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 
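The env_mem_callbacks test being set up here exercises SPDK's memory-map notification hooks, which SPDK drives from the DPDK mem event callback ("Mem event callback 'spdk:(nil)' registered") seen earlier in this log. Below is a minimal, hypothetical sketch of registering such a callback; it assumes the SPDK headers are installed and the program name is illustrative.

    /* Sketch: a mem map whose notify callback fires for every region
     * registered with or unregistered from the SPDK environment. */
    #include <stdio.h>
    #include "spdk/env.h"

    static int
    notify_cb(void *cb_ctx, struct spdk_mem_map *map,
              enum spdk_mem_map_notify_action action,
              void *vaddr, size_t size)
    {
        printf("%s: vaddr=%p size=%zu\n",
               action == SPDK_MEM_MAP_NOTIFY_REGISTER ? "register" : "unregister",
               vaddr, size);
        return 0;
    }

    static const struct spdk_mem_map_ops ops = {
        .notify_cb = notify_cb,
        .are_contiguous = NULL,
    };

    int main(void)
    {
        struct spdk_env_opts opts;
        struct spdk_mem_map *map;

        spdk_env_opts_init(&opts);
        opts.name = "mem_callbacks_sketch";   /* arbitrary app name */
        if (spdk_env_init(&opts) < 0) {
            return 1;
        }

        /* Existing registrations are replayed through notify_cb here. */
        map = spdk_mem_map_alloc(0, &ops, NULL);
        if (map == NULL) {
            return 1;
        }

        spdk_mem_map_free(&map);
        return 0;
    }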
00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max 
queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.603 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:19.603 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.603 CRYPTODEV: Creating 
cryptodev 0000:d0:01.1_qat_sym 00:06:19.603 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:06:19.604 
CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:19.604 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:06:19.604 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:19.604 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:19.604 00:06:19.604 00:06:19.604 CUnit - A unit testing framework for C - Version 2.1-3 00:06:19.604 http://cunit.sourceforge.net/ 00:06:19.604 00:06:19.604 00:06:19.604 Suite: memory 00:06:19.604 Test: test ... 
00:06:19.604 register 0x200000200000 2097152 00:06:19.604 register 0x201000a00000 2097152 00:06:19.604 malloc 3145728 00:06:19.604 register 0x200000400000 4194304 00:06:19.604 buf 0x200000500000 len 3145728 PASSED 00:06:19.604 malloc 64 00:06:19.604 buf 0x2000004fff40 len 64 PASSED 00:06:19.604 malloc 4194304 00:06:19.604 register 0x200000800000 6291456 00:06:19.604 buf 0x200000a00000 len 4194304 PASSED 00:06:19.604 free 0x200000500000 3145728 00:06:19.604 free 0x2000004fff40 64 00:06:19.604 unregister 0x200000400000 4194304 PASSED 00:06:19.604 free 0x200000a00000 4194304 00:06:19.604 unregister 0x200000800000 6291456 PASSED 00:06:19.604 malloc 8388608 00:06:19.604 register 0x200000400000 10485760 00:06:19.604 buf 0x200000600000 len 8388608 PASSED 00:06:19.604 free 0x200000600000 8388608 00:06:19.604 unregister 0x200000400000 10485760 PASSED 00:06:19.604 passed 00:06:19.604 00:06:19.604 Run Summary: Type Total Ran Passed Failed Inactive 00:06:19.604 suites 1 1 n/a 0 0 00:06:19.604 tests 1 1 1 0 0 00:06:19.604 asserts 16 16 16 0 n/a 00:06:19.604 00:06:19.604 Elapsed time = 0.008 seconds 00:06:19.604 00:06:19.604 real 0m0.089s 00:06:19.604 user 0m0.028s 00:06:19.604 sys 0m0.060s 00:06:19.604 07:42:04 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.604 07:42:04 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:19.604 ************************************ 00:06:19.604 END TEST env_mem_callbacks 00:06:19.604 ************************************ 00:06:19.604 07:42:04 env -- common/autotest_common.sh@1142 -- # return 0 00:06:19.604 00:06:19.604 real 0m12.046s 00:06:19.604 user 0m5.057s 00:06:19.604 sys 0m1.014s 00:06:19.604 07:42:04 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:19.604 07:42:04 env -- common/autotest_common.sh@10 -- # set +x 00:06:19.604 ************************************ 00:06:19.604 END TEST env 00:06:19.604 ************************************ 00:06:19.604 07:42:04 -- common/autotest_common.sh@1142 -- # return 0 00:06:19.604 07:42:04 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:19.604 07:42:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:19.604 07:42:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.604 07:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:19.604 ************************************ 00:06:19.604 START TEST rpc 00:06:19.604 ************************************ 00:06:19.604 07:42:04 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:19.865 * Looking for test storage... 
00:06:19.865 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:19.865 07:42:04 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1539358 00:06:19.865 07:42:04 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:19.865 07:42:04 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:19.865 07:42:04 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1539358 00:06:19.865 07:42:04 rpc -- common/autotest_common.sh@829 -- # '[' -z 1539358 ']' 00:06:19.865 07:42:04 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.865 07:42:04 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.865 07:42:04 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.865 07:42:04 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.865 07:42:04 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:19.865 [2024-07-15 07:42:04.444092] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:06:19.865 [2024-07-15 07:42:04.444145] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1539358 ] 00:06:19.865 [2024-07-15 07:42:04.534788] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.865 [2024-07-15 07:42:04.600663] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:19.865 [2024-07-15 07:42:04.600695] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1539358' to capture a snapshot of events at runtime. 00:06:19.865 [2024-07-15 07:42:04.600702] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:19.865 [2024-07-15 07:42:04.600716] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:19.865 [2024-07-15 07:42:04.600722] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1539358 for offline analysis/debug. 
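Note: the two app_setup_trace notices above describe how the "bdev" tracepoints enabled with '-e bdev' can be inspected by hand. A minimal sketch of doing that outside the test harness is shown below, assuming a standard SPDK build tree; build/bin/spdk_trace and scripts/rpc.py are assumed locations, and the PID and /dev/shm file name are the ones from this particular run and would differ on another machine.
  build/bin/spdk_tgt -e bdev &                          # start the target with the bdev tpoint group enabled, as rpc.sh@64 does above
  TGT_PID=$!
  scripts/rpc.py trace_get_info | jq .                  # same RPC that rpc_trace_cmd_test exercises further down in this log
  build/bin/spdk_trace -s spdk_tgt -p "$TGT_PID"        # capture a runtime snapshot, per the NOTICE above
  cp "/dev/shm/spdk_tgt_trace.pid${TGT_PID}" /tmp/      # or keep the shm file for offline analysis/debug
  kill "$TGT_PID"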
00:06:19.865 [2024-07-15 07:42:04.600740] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.807 07:42:05 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.807 07:42:05 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:20.807 07:42:05 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:20.807 07:42:05 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:20.807 07:42:05 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:20.807 07:42:05 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:20.807 07:42:05 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:20.807 07:42:05 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:20.807 07:42:05 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:20.807 ************************************ 00:06:20.807 START TEST rpc_integrity 00:06:20.807 ************************************ 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:20.807 { 00:06:20.807 "name": "Malloc0", 00:06:20.807 "aliases": [ 00:06:20.807 "bd2e6e3f-1b1b-49bf-83af-6d4d73e0baf4" 00:06:20.807 ], 00:06:20.807 "product_name": "Malloc disk", 00:06:20.807 "block_size": 512, 00:06:20.807 "num_blocks": 16384, 00:06:20.807 "uuid": "bd2e6e3f-1b1b-49bf-83af-6d4d73e0baf4", 00:06:20.807 "assigned_rate_limits": { 00:06:20.807 "rw_ios_per_sec": 0, 00:06:20.807 "rw_mbytes_per_sec": 0, 00:06:20.807 "r_mbytes_per_sec": 0, 00:06:20.807 "w_mbytes_per_sec": 0 00:06:20.807 }, 00:06:20.807 
"claimed": false, 00:06:20.807 "zoned": false, 00:06:20.807 "supported_io_types": { 00:06:20.807 "read": true, 00:06:20.807 "write": true, 00:06:20.807 "unmap": true, 00:06:20.807 "flush": true, 00:06:20.807 "reset": true, 00:06:20.807 "nvme_admin": false, 00:06:20.807 "nvme_io": false, 00:06:20.807 "nvme_io_md": false, 00:06:20.807 "write_zeroes": true, 00:06:20.807 "zcopy": true, 00:06:20.807 "get_zone_info": false, 00:06:20.807 "zone_management": false, 00:06:20.807 "zone_append": false, 00:06:20.807 "compare": false, 00:06:20.807 "compare_and_write": false, 00:06:20.807 "abort": true, 00:06:20.807 "seek_hole": false, 00:06:20.807 "seek_data": false, 00:06:20.807 "copy": true, 00:06:20.807 "nvme_iov_md": false 00:06:20.807 }, 00:06:20.807 "memory_domains": [ 00:06:20.807 { 00:06:20.807 "dma_device_id": "system", 00:06:20.807 "dma_device_type": 1 00:06:20.807 }, 00:06:20.807 { 00:06:20.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.807 "dma_device_type": 2 00:06:20.807 } 00:06:20.807 ], 00:06:20.807 "driver_specific": {} 00:06:20.807 } 00:06:20.807 ]' 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.807 [2024-07-15 07:42:05.432734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:20.807 [2024-07-15 07:42:05.432764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:20.807 [2024-07-15 07:42:05.432776] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeece00 00:06:20.807 [2024-07-15 07:42:05.432787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:20.807 [2024-07-15 07:42:05.434053] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:20.807 [2024-07-15 07:42:05.434072] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:20.807 Passthru0 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.807 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.807 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:20.807 { 00:06:20.807 "name": "Malloc0", 00:06:20.807 "aliases": [ 00:06:20.807 "bd2e6e3f-1b1b-49bf-83af-6d4d73e0baf4" 00:06:20.807 ], 00:06:20.807 "product_name": "Malloc disk", 00:06:20.807 "block_size": 512, 00:06:20.807 "num_blocks": 16384, 00:06:20.807 "uuid": "bd2e6e3f-1b1b-49bf-83af-6d4d73e0baf4", 00:06:20.807 "assigned_rate_limits": { 00:06:20.807 "rw_ios_per_sec": 0, 00:06:20.807 "rw_mbytes_per_sec": 0, 00:06:20.807 "r_mbytes_per_sec": 0, 00:06:20.807 "w_mbytes_per_sec": 0 00:06:20.807 }, 00:06:20.808 "claimed": true, 00:06:20.808 "claim_type": "exclusive_write", 00:06:20.808 "zoned": false, 00:06:20.808 "supported_io_types": { 00:06:20.808 "read": true, 00:06:20.808 "write": true, 00:06:20.808 "unmap": true, 00:06:20.808 "flush": true, 
00:06:20.808 "reset": true, 00:06:20.808 "nvme_admin": false, 00:06:20.808 "nvme_io": false, 00:06:20.808 "nvme_io_md": false, 00:06:20.808 "write_zeroes": true, 00:06:20.808 "zcopy": true, 00:06:20.808 "get_zone_info": false, 00:06:20.808 "zone_management": false, 00:06:20.808 "zone_append": false, 00:06:20.808 "compare": false, 00:06:20.808 "compare_and_write": false, 00:06:20.808 "abort": true, 00:06:20.808 "seek_hole": false, 00:06:20.808 "seek_data": false, 00:06:20.808 "copy": true, 00:06:20.808 "nvme_iov_md": false 00:06:20.808 }, 00:06:20.808 "memory_domains": [ 00:06:20.808 { 00:06:20.808 "dma_device_id": "system", 00:06:20.808 "dma_device_type": 1 00:06:20.808 }, 00:06:20.808 { 00:06:20.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.808 "dma_device_type": 2 00:06:20.808 } 00:06:20.808 ], 00:06:20.808 "driver_specific": {} 00:06:20.808 }, 00:06:20.808 { 00:06:20.808 "name": "Passthru0", 00:06:20.808 "aliases": [ 00:06:20.808 "d2657def-ee3a-50fe-af0e-b84a74256d87" 00:06:20.808 ], 00:06:20.808 "product_name": "passthru", 00:06:20.808 "block_size": 512, 00:06:20.808 "num_blocks": 16384, 00:06:20.808 "uuid": "d2657def-ee3a-50fe-af0e-b84a74256d87", 00:06:20.808 "assigned_rate_limits": { 00:06:20.808 "rw_ios_per_sec": 0, 00:06:20.808 "rw_mbytes_per_sec": 0, 00:06:20.808 "r_mbytes_per_sec": 0, 00:06:20.808 "w_mbytes_per_sec": 0 00:06:20.808 }, 00:06:20.808 "claimed": false, 00:06:20.808 "zoned": false, 00:06:20.808 "supported_io_types": { 00:06:20.808 "read": true, 00:06:20.808 "write": true, 00:06:20.808 "unmap": true, 00:06:20.808 "flush": true, 00:06:20.808 "reset": true, 00:06:20.808 "nvme_admin": false, 00:06:20.808 "nvme_io": false, 00:06:20.808 "nvme_io_md": false, 00:06:20.808 "write_zeroes": true, 00:06:20.808 "zcopy": true, 00:06:20.808 "get_zone_info": false, 00:06:20.808 "zone_management": false, 00:06:20.808 "zone_append": false, 00:06:20.808 "compare": false, 00:06:20.808 "compare_and_write": false, 00:06:20.808 "abort": true, 00:06:20.808 "seek_hole": false, 00:06:20.808 "seek_data": false, 00:06:20.808 "copy": true, 00:06:20.808 "nvme_iov_md": false 00:06:20.808 }, 00:06:20.808 "memory_domains": [ 00:06:20.808 { 00:06:20.808 "dma_device_id": "system", 00:06:20.808 "dma_device_type": 1 00:06:20.808 }, 00:06:20.808 { 00:06:20.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:20.808 "dma_device_type": 2 00:06:20.808 } 00:06:20.808 ], 00:06:20.808 "driver_specific": { 00:06:20.808 "passthru": { 00:06:20.808 "name": "Passthru0", 00:06:20.808 "base_bdev_name": "Malloc0" 00:06:20.808 } 00:06:20.808 } 00:06:20.808 } 00:06:20.808 ]' 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:20.808 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:20.808 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:21.069 07:42:05 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:21.069 00:06:21.069 real 0m0.301s 00:06:21.069 user 0m0.192s 00:06:21.069 sys 0m0.039s 00:06:21.069 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.069 07:42:05 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 ************************************ 00:06:21.069 END TEST rpc_integrity 00:06:21.069 ************************************ 00:06:21.069 07:42:05 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:21.069 07:42:05 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:21.069 07:42:05 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:21.069 07:42:05 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.069 07:42:05 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 ************************************ 00:06:21.069 START TEST rpc_plugins 00:06:21.069 ************************************ 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:21.069 { 00:06:21.069 "name": "Malloc1", 00:06:21.069 "aliases": [ 00:06:21.069 "9e375e90-f319-4f44-82ee-7b779be014b7" 00:06:21.069 ], 00:06:21.069 "product_name": "Malloc disk", 00:06:21.069 "block_size": 4096, 00:06:21.069 "num_blocks": 256, 00:06:21.069 "uuid": "9e375e90-f319-4f44-82ee-7b779be014b7", 00:06:21.069 "assigned_rate_limits": { 00:06:21.069 "rw_ios_per_sec": 0, 00:06:21.069 "rw_mbytes_per_sec": 0, 00:06:21.069 "r_mbytes_per_sec": 0, 00:06:21.069 "w_mbytes_per_sec": 0 00:06:21.069 }, 00:06:21.069 "claimed": false, 00:06:21.069 "zoned": false, 00:06:21.069 "supported_io_types": { 00:06:21.069 "read": true, 00:06:21.069 "write": true, 00:06:21.069 "unmap": true, 00:06:21.069 "flush": true, 00:06:21.069 "reset": true, 00:06:21.069 "nvme_admin": false, 00:06:21.069 "nvme_io": false, 00:06:21.069 "nvme_io_md": false, 00:06:21.069 "write_zeroes": true, 00:06:21.069 "zcopy": true, 00:06:21.069 "get_zone_info": false, 00:06:21.069 "zone_management": false, 00:06:21.069 "zone_append": false, 00:06:21.069 "compare": false, 00:06:21.069 "compare_and_write": false, 00:06:21.069 "abort": true, 00:06:21.069 "seek_hole": false, 00:06:21.069 "seek_data": 
false, 00:06:21.069 "copy": true, 00:06:21.069 "nvme_iov_md": false 00:06:21.069 }, 00:06:21.069 "memory_domains": [ 00:06:21.069 { 00:06:21.069 "dma_device_id": "system", 00:06:21.069 "dma_device_type": 1 00:06:21.069 }, 00:06:21.069 { 00:06:21.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.069 "dma_device_type": 2 00:06:21.069 } 00:06:21.069 ], 00:06:21.069 "driver_specific": {} 00:06:21.069 } 00:06:21.069 ]' 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:21.069 07:42:05 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:21.069 00:06:21.069 real 0m0.152s 00:06:21.069 user 0m0.096s 00:06:21.069 sys 0m0.020s 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.069 07:42:05 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:21.069 ************************************ 00:06:21.069 END TEST rpc_plugins 00:06:21.069 ************************************ 00:06:21.331 07:42:05 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:21.331 07:42:05 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:21.331 07:42:05 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:21.331 07:42:05 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.331 07:42:05 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.331 ************************************ 00:06:21.331 START TEST rpc_trace_cmd_test 00:06:21.331 ************************************ 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:21.331 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1539358", 00:06:21.331 "tpoint_group_mask": "0x8", 00:06:21.331 "iscsi_conn": { 00:06:21.331 "mask": "0x2", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "scsi": { 00:06:21.331 "mask": "0x4", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "bdev": { 00:06:21.331 "mask": "0x8", 00:06:21.331 "tpoint_mask": "0xffffffffffffffff" 00:06:21.331 }, 00:06:21.331 "nvmf_rdma": { 00:06:21.331 
"mask": "0x10", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "nvmf_tcp": { 00:06:21.331 "mask": "0x20", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "ftl": { 00:06:21.331 "mask": "0x40", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "blobfs": { 00:06:21.331 "mask": "0x80", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "dsa": { 00:06:21.331 "mask": "0x200", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "thread": { 00:06:21.331 "mask": "0x400", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "nvme_pcie": { 00:06:21.331 "mask": "0x800", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "iaa": { 00:06:21.331 "mask": "0x1000", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "nvme_tcp": { 00:06:21.331 "mask": "0x2000", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "bdev_nvme": { 00:06:21.331 "mask": "0x4000", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 }, 00:06:21.331 "sock": { 00:06:21.331 "mask": "0x8000", 00:06:21.331 "tpoint_mask": "0x0" 00:06:21.331 } 00:06:21.331 }' 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:21.331 07:42:05 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:21.331 07:42:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:21.331 07:42:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:21.331 07:42:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:21.591 07:42:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:21.591 07:42:06 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:21.591 00:06:21.591 real 0m0.247s 00:06:21.591 user 0m0.210s 00:06:21.591 sys 0m0.028s 00:06:21.591 07:42:06 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.591 07:42:06 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:21.591 ************************************ 00:06:21.591 END TEST rpc_trace_cmd_test 00:06:21.591 ************************************ 00:06:21.591 07:42:06 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:21.591 07:42:06 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:21.591 07:42:06 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:21.591 07:42:06 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:21.591 07:42:06 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:21.591 07:42:06 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.591 07:42:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:21.591 ************************************ 00:06:21.591 START TEST rpc_daemon_integrity 00:06:21.591 ************************************ 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.591 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:21.591 { 00:06:21.591 "name": "Malloc2", 00:06:21.591 "aliases": [ 00:06:21.591 "31a01d61-3ed8-42a4-a6cd-3b6c70afdd47" 00:06:21.591 ], 00:06:21.591 "product_name": "Malloc disk", 00:06:21.591 "block_size": 512, 00:06:21.591 "num_blocks": 16384, 00:06:21.591 "uuid": "31a01d61-3ed8-42a4-a6cd-3b6c70afdd47", 00:06:21.591 "assigned_rate_limits": { 00:06:21.591 "rw_ios_per_sec": 0, 00:06:21.591 "rw_mbytes_per_sec": 0, 00:06:21.591 "r_mbytes_per_sec": 0, 00:06:21.591 "w_mbytes_per_sec": 0 00:06:21.591 }, 00:06:21.591 "claimed": false, 00:06:21.591 "zoned": false, 00:06:21.591 "supported_io_types": { 00:06:21.591 "read": true, 00:06:21.591 "write": true, 00:06:21.591 "unmap": true, 00:06:21.591 "flush": true, 00:06:21.591 "reset": true, 00:06:21.591 "nvme_admin": false, 00:06:21.591 "nvme_io": false, 00:06:21.591 "nvme_io_md": false, 00:06:21.591 "write_zeroes": true, 00:06:21.591 "zcopy": true, 00:06:21.591 "get_zone_info": false, 00:06:21.591 "zone_management": false, 00:06:21.591 "zone_append": false, 00:06:21.591 "compare": false, 00:06:21.591 "compare_and_write": false, 00:06:21.591 "abort": true, 00:06:21.591 "seek_hole": false, 00:06:21.591 "seek_data": false, 00:06:21.591 "copy": true, 00:06:21.591 "nvme_iov_md": false 00:06:21.591 }, 00:06:21.591 "memory_domains": [ 00:06:21.591 { 00:06:21.591 "dma_device_id": "system", 00:06:21.591 "dma_device_type": 1 00:06:21.591 }, 00:06:21.591 { 00:06:21.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.592 "dma_device_type": 2 00:06:21.592 } 00:06:21.592 ], 00:06:21.592 "driver_specific": {} 00:06:21.592 } 00:06:21.592 ]' 00:06:21.592 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:21.592 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:21.592 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.852 [2024-07-15 07:42:06.351194] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:21.852 [2024-07-15 07:42:06.351220] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:21.852 
[2024-07-15 07:42:06.351232] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1090680 00:06:21.852 [2024-07-15 07:42:06.351239] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:21.852 [2024-07-15 07:42:06.352385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:21.852 [2024-07-15 07:42:06.352404] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:21.852 Passthru0 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:21.852 { 00:06:21.852 "name": "Malloc2", 00:06:21.852 "aliases": [ 00:06:21.852 "31a01d61-3ed8-42a4-a6cd-3b6c70afdd47" 00:06:21.852 ], 00:06:21.852 "product_name": "Malloc disk", 00:06:21.852 "block_size": 512, 00:06:21.852 "num_blocks": 16384, 00:06:21.852 "uuid": "31a01d61-3ed8-42a4-a6cd-3b6c70afdd47", 00:06:21.852 "assigned_rate_limits": { 00:06:21.852 "rw_ios_per_sec": 0, 00:06:21.852 "rw_mbytes_per_sec": 0, 00:06:21.852 "r_mbytes_per_sec": 0, 00:06:21.852 "w_mbytes_per_sec": 0 00:06:21.852 }, 00:06:21.852 "claimed": true, 00:06:21.852 "claim_type": "exclusive_write", 00:06:21.852 "zoned": false, 00:06:21.852 "supported_io_types": { 00:06:21.852 "read": true, 00:06:21.852 "write": true, 00:06:21.852 "unmap": true, 00:06:21.852 "flush": true, 00:06:21.852 "reset": true, 00:06:21.852 "nvme_admin": false, 00:06:21.852 "nvme_io": false, 00:06:21.852 "nvme_io_md": false, 00:06:21.852 "write_zeroes": true, 00:06:21.852 "zcopy": true, 00:06:21.852 "get_zone_info": false, 00:06:21.852 "zone_management": false, 00:06:21.852 "zone_append": false, 00:06:21.852 "compare": false, 00:06:21.852 "compare_and_write": false, 00:06:21.852 "abort": true, 00:06:21.852 "seek_hole": false, 00:06:21.852 "seek_data": false, 00:06:21.852 "copy": true, 00:06:21.852 "nvme_iov_md": false 00:06:21.852 }, 00:06:21.852 "memory_domains": [ 00:06:21.852 { 00:06:21.852 "dma_device_id": "system", 00:06:21.852 "dma_device_type": 1 00:06:21.852 }, 00:06:21.852 { 00:06:21.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.852 "dma_device_type": 2 00:06:21.852 } 00:06:21.852 ], 00:06:21.852 "driver_specific": {} 00:06:21.852 }, 00:06:21.852 { 00:06:21.852 "name": "Passthru0", 00:06:21.852 "aliases": [ 00:06:21.852 "756ed5ba-a75c-5648-9484-89f5c4bd10be" 00:06:21.852 ], 00:06:21.852 "product_name": "passthru", 00:06:21.852 "block_size": 512, 00:06:21.852 "num_blocks": 16384, 00:06:21.852 "uuid": "756ed5ba-a75c-5648-9484-89f5c4bd10be", 00:06:21.852 "assigned_rate_limits": { 00:06:21.852 "rw_ios_per_sec": 0, 00:06:21.852 "rw_mbytes_per_sec": 0, 00:06:21.852 "r_mbytes_per_sec": 0, 00:06:21.852 "w_mbytes_per_sec": 0 00:06:21.852 }, 00:06:21.852 "claimed": false, 00:06:21.852 "zoned": false, 00:06:21.852 "supported_io_types": { 00:06:21.852 "read": true, 00:06:21.852 "write": true, 00:06:21.852 "unmap": true, 00:06:21.852 "flush": true, 00:06:21.852 "reset": true, 00:06:21.852 "nvme_admin": false, 00:06:21.852 "nvme_io": false, 00:06:21.852 "nvme_io_md": false, 00:06:21.852 
"write_zeroes": true, 00:06:21.852 "zcopy": true, 00:06:21.852 "get_zone_info": false, 00:06:21.852 "zone_management": false, 00:06:21.852 "zone_append": false, 00:06:21.852 "compare": false, 00:06:21.852 "compare_and_write": false, 00:06:21.852 "abort": true, 00:06:21.852 "seek_hole": false, 00:06:21.852 "seek_data": false, 00:06:21.852 "copy": true, 00:06:21.852 "nvme_iov_md": false 00:06:21.852 }, 00:06:21.852 "memory_domains": [ 00:06:21.852 { 00:06:21.852 "dma_device_id": "system", 00:06:21.852 "dma_device_type": 1 00:06:21.852 }, 00:06:21.852 { 00:06:21.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:21.852 "dma_device_type": 2 00:06:21.852 } 00:06:21.852 ], 00:06:21.852 "driver_specific": { 00:06:21.852 "passthru": { 00:06:21.852 "name": "Passthru0", 00:06:21.852 "base_bdev_name": "Malloc2" 00:06:21.852 } 00:06:21.852 } 00:06:21.852 } 00:06:21.852 ]' 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:21.852 07:42:06 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:21.853 00:06:21.853 real 0m0.296s 00:06:21.853 user 0m0.188s 00:06:21.853 sys 0m0.044s 00:06:21.853 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:21.853 07:42:06 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:21.853 ************************************ 00:06:21.853 END TEST rpc_daemon_integrity 00:06:21.853 ************************************ 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:21.853 07:42:06 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:21.853 07:42:06 rpc -- rpc/rpc.sh@84 -- # killprocess 1539358 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@948 -- # '[' -z 1539358 ']' 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@952 -- # kill -0 1539358 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@953 -- # uname 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1539358 00:06:21.853 07:42:06 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1539358' 00:06:21.853 killing process with pid 1539358 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@967 -- # kill 1539358 00:06:21.853 07:42:06 rpc -- common/autotest_common.sh@972 -- # wait 1539358 00:06:22.113 00:06:22.113 real 0m2.517s 00:06:22.113 user 0m3.341s 00:06:22.113 sys 0m0.696s 00:06:22.113 07:42:06 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:22.114 07:42:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.114 ************************************ 00:06:22.114 END TEST rpc 00:06:22.114 ************************************ 00:06:22.114 07:42:06 -- common/autotest_common.sh@1142 -- # return 0 00:06:22.114 07:42:06 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:22.114 07:42:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.114 07:42:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.114 07:42:06 -- common/autotest_common.sh@10 -- # set +x 00:06:22.114 ************************************ 00:06:22.114 START TEST skip_rpc 00:06:22.114 ************************************ 00:06:22.114 07:42:06 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:22.375 * Looking for test storage... 00:06:22.375 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:22.375 07:42:06 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:22.375 07:42:06 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:22.375 07:42:06 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:22.375 07:42:06 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:22.375 07:42:06 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.375 07:42:06 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:22.375 ************************************ 00:06:22.375 START TEST skip_rpc 00:06:22.375 ************************************ 00:06:22.375 07:42:07 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:22.375 07:42:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1539899 00:06:22.375 07:42:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:22.375 07:42:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:22.375 07:42:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:22.375 [2024-07-15 07:42:07.070154] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:06:22.375 [2024-07-15 07:42:07.070209] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1539899 ] 00:06:22.636 [2024-07-15 07:42:07.162828] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.636 [2024-07-15 07:42:07.239925] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1539899 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 1539899 ']' 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 1539899 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1539899 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1539899' 00:06:27.920 killing process with pid 1539899 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 1539899 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 1539899 00:06:27.920 00:06:27.920 real 0m5.273s 00:06:27.920 user 0m5.026s 00:06:27.920 sys 0m0.265s 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:27.920 07:42:12 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.920 ************************************ 00:06:27.920 END TEST skip_rpc 00:06:27.920 
************************************ 00:06:27.920 07:42:12 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:27.920 07:42:12 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:27.920 07:42:12 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:27.920 07:42:12 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.920 07:42:12 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.920 ************************************ 00:06:27.920 START TEST skip_rpc_with_json 00:06:27.920 ************************************ 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1541262 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1541262 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 1541262 ']' 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:27.920 07:42:12 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:27.920 [2024-07-15 07:42:12.417913] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
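Aside, not from the log: the "Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock..." message is printed by the harness's waitforlisten helper. A simplified equivalent of that wait (the retry count and sleep interval are illustrative; the real helper also re-checks that the pid is still alive):

  ./spdk/build/bin/spdk_tgt -m 0x1 &
  tgt_pid=$!
  # Poll the default RPC socket until the target answers, then carry on.
  for _ in $(seq 1 100); do
      ./spdk/scripts/rpc.py -s /var/tmp/spdk.sock spdk_get_version >/dev/null 2>&1 && break
      sleep 0.5
  done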
00:06:27.920 [2024-07-15 07:42:12.417970] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1541262 ] 00:06:27.920 [2024-07-15 07:42:12.509730] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.920 [2024-07-15 07:42:12.586650] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.490 [2024-07-15 07:42:13.223860] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:28.490 request: 00:06:28.490 { 00:06:28.490 "trtype": "tcp", 00:06:28.490 "method": "nvmf_get_transports", 00:06:28.490 "req_id": 1 00:06:28.490 } 00:06:28.490 Got JSON-RPC error response 00:06:28.490 response: 00:06:28.490 { 00:06:28.490 "code": -19, 00:06:28.490 "message": "No such device" 00:06:28.490 } 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.490 [2024-07-15 07:42:13.235978] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.490 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:28.750 { 00:06:28.750 "subsystems": [ 00:06:28.750 { 00:06:28.750 "subsystem": "keyring", 00:06:28.750 "config": [] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "iobuf", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "iobuf_set_options", 00:06:28.750 "params": { 00:06:28.750 "small_pool_count": 8192, 00:06:28.750 "large_pool_count": 1024, 00:06:28.750 "small_bufsize": 8192, 00:06:28.750 "large_bufsize": 135168 00:06:28.750 } 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "sock", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "sock_set_default_impl", 00:06:28.750 "params": { 00:06:28.750 "impl_name": "posix" 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "sock_impl_set_options", 00:06:28.750 "params": { 00:06:28.750 "impl_name": "ssl", 00:06:28.750 "recv_buf_size": 4096, 00:06:28.750 "send_buf_size": 4096, 
00:06:28.750 "enable_recv_pipe": true, 00:06:28.750 "enable_quickack": false, 00:06:28.750 "enable_placement_id": 0, 00:06:28.750 "enable_zerocopy_send_server": true, 00:06:28.750 "enable_zerocopy_send_client": false, 00:06:28.750 "zerocopy_threshold": 0, 00:06:28.750 "tls_version": 0, 00:06:28.750 "enable_ktls": false 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "sock_impl_set_options", 00:06:28.750 "params": { 00:06:28.750 "impl_name": "posix", 00:06:28.750 "recv_buf_size": 2097152, 00:06:28.750 "send_buf_size": 2097152, 00:06:28.750 "enable_recv_pipe": true, 00:06:28.750 "enable_quickack": false, 00:06:28.750 "enable_placement_id": 0, 00:06:28.750 "enable_zerocopy_send_server": true, 00:06:28.750 "enable_zerocopy_send_client": false, 00:06:28.750 "zerocopy_threshold": 0, 00:06:28.750 "tls_version": 0, 00:06:28.750 "enable_ktls": false 00:06:28.750 } 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "vmd", 00:06:28.750 "config": [] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "accel", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "accel_set_options", 00:06:28.750 "params": { 00:06:28.750 "small_cache_size": 128, 00:06:28.750 "large_cache_size": 16, 00:06:28.750 "task_count": 2048, 00:06:28.750 "sequence_count": 2048, 00:06:28.750 "buf_count": 2048 00:06:28.750 } 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "bdev", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "bdev_set_options", 00:06:28.750 "params": { 00:06:28.750 "bdev_io_pool_size": 65535, 00:06:28.750 "bdev_io_cache_size": 256, 00:06:28.750 "bdev_auto_examine": true, 00:06:28.750 "iobuf_small_cache_size": 128, 00:06:28.750 "iobuf_large_cache_size": 16 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "bdev_raid_set_options", 00:06:28.750 "params": { 00:06:28.750 "process_window_size_kb": 1024 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "bdev_iscsi_set_options", 00:06:28.750 "params": { 00:06:28.750 "timeout_sec": 30 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "bdev_nvme_set_options", 00:06:28.750 "params": { 00:06:28.750 "action_on_timeout": "none", 00:06:28.750 "timeout_us": 0, 00:06:28.750 "timeout_admin_us": 0, 00:06:28.750 "keep_alive_timeout_ms": 10000, 00:06:28.750 "arbitration_burst": 0, 00:06:28.750 "low_priority_weight": 0, 00:06:28.750 "medium_priority_weight": 0, 00:06:28.750 "high_priority_weight": 0, 00:06:28.750 "nvme_adminq_poll_period_us": 10000, 00:06:28.750 "nvme_ioq_poll_period_us": 0, 00:06:28.750 "io_queue_requests": 0, 00:06:28.750 "delay_cmd_submit": true, 00:06:28.750 "transport_retry_count": 4, 00:06:28.750 "bdev_retry_count": 3, 00:06:28.750 "transport_ack_timeout": 0, 00:06:28.750 "ctrlr_loss_timeout_sec": 0, 00:06:28.750 "reconnect_delay_sec": 0, 00:06:28.750 "fast_io_fail_timeout_sec": 0, 00:06:28.750 "disable_auto_failback": false, 00:06:28.750 "generate_uuids": false, 00:06:28.750 "transport_tos": 0, 00:06:28.750 "nvme_error_stat": false, 00:06:28.750 "rdma_srq_size": 0, 00:06:28.750 "io_path_stat": false, 00:06:28.750 "allow_accel_sequence": false, 00:06:28.750 "rdma_max_cq_size": 0, 00:06:28.750 "rdma_cm_event_timeout_ms": 0, 00:06:28.750 "dhchap_digests": [ 00:06:28.750 "sha256", 00:06:28.750 "sha384", 00:06:28.750 "sha512" 00:06:28.750 ], 00:06:28.750 "dhchap_dhgroups": [ 00:06:28.750 "null", 00:06:28.750 "ffdhe2048", 00:06:28.750 "ffdhe3072", 00:06:28.750 "ffdhe4096", 00:06:28.750 
"ffdhe6144", 00:06:28.750 "ffdhe8192" 00:06:28.750 ] 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "bdev_nvme_set_hotplug", 00:06:28.750 "params": { 00:06:28.750 "period_us": 100000, 00:06:28.750 "enable": false 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "bdev_wait_for_examine" 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "scsi", 00:06:28.750 "config": null 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "scheduler", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "framework_set_scheduler", 00:06:28.750 "params": { 00:06:28.750 "name": "static" 00:06:28.750 } 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "vhost_scsi", 00:06:28.750 "config": [] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "vhost_blk", 00:06:28.750 "config": [] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "ublk", 00:06:28.750 "config": [] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "nbd", 00:06:28.750 "config": [] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "nvmf", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "nvmf_set_config", 00:06:28.750 "params": { 00:06:28.750 "discovery_filter": "match_any", 00:06:28.750 "admin_cmd_passthru": { 00:06:28.750 "identify_ctrlr": false 00:06:28.750 } 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "nvmf_set_max_subsystems", 00:06:28.750 "params": { 00:06:28.750 "max_subsystems": 1024 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "nvmf_set_crdt", 00:06:28.750 "params": { 00:06:28.750 "crdt1": 0, 00:06:28.750 "crdt2": 0, 00:06:28.750 "crdt3": 0 00:06:28.750 } 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "method": "nvmf_create_transport", 00:06:28.750 "params": { 00:06:28.750 "trtype": "TCP", 00:06:28.750 "max_queue_depth": 128, 00:06:28.750 "max_io_qpairs_per_ctrlr": 127, 00:06:28.750 "in_capsule_data_size": 4096, 00:06:28.750 "max_io_size": 131072, 00:06:28.750 "io_unit_size": 131072, 00:06:28.750 "max_aq_depth": 128, 00:06:28.750 "num_shared_buffers": 511, 00:06:28.750 "buf_cache_size": 4294967295, 00:06:28.750 "dif_insert_or_strip": false, 00:06:28.750 "zcopy": false, 00:06:28.750 "c2h_success": true, 00:06:28.750 "sock_priority": 0, 00:06:28.750 "abort_timeout_sec": 1, 00:06:28.750 "ack_timeout": 0, 00:06:28.750 "data_wr_pool_size": 0 00:06:28.750 } 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 }, 00:06:28.750 { 00:06:28.750 "subsystem": "iscsi", 00:06:28.750 "config": [ 00:06:28.750 { 00:06:28.750 "method": "iscsi_set_options", 00:06:28.750 "params": { 00:06:28.750 "node_base": "iqn.2016-06.io.spdk", 00:06:28.750 "max_sessions": 128, 00:06:28.750 "max_connections_per_session": 2, 00:06:28.750 "max_queue_depth": 64, 00:06:28.750 "default_time2wait": 2, 00:06:28.750 "default_time2retain": 20, 00:06:28.750 "first_burst_length": 8192, 00:06:28.750 "immediate_data": true, 00:06:28.750 "allow_duplicated_isid": false, 00:06:28.750 "error_recovery_level": 0, 00:06:28.750 "nop_timeout": 60, 00:06:28.750 "nop_in_interval": 30, 00:06:28.750 "disable_chap": false, 00:06:28.750 "require_chap": false, 00:06:28.750 "mutual_chap": false, 00:06:28.750 "chap_group": 0, 00:06:28.750 "max_large_datain_per_connection": 64, 00:06:28.750 "max_r2t_per_connection": 4, 00:06:28.750 "pdu_pool_size": 36864, 00:06:28.750 "immediate_data_pool_size": 16384, 00:06:28.750 "data_out_pool_size": 2048 00:06:28.750 } 00:06:28.750 } 00:06:28.750 ] 00:06:28.750 } 
00:06:28.750 ] 00:06:28.750 } 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1541262 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1541262 ']' 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1541262 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:28.750 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:28.751 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1541262 00:06:28.751 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:28.751 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:28.751 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1541262' 00:06:28.751 killing process with pid 1541262 00:06:28.751 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1541262 00:06:28.751 07:42:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1541262 00:06:29.010 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1541549 00:06:29.010 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:29.010 07:42:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1541549 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1541549 ']' 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1541549 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1541549 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1541549' 00:06:34.291 killing process with pid 1541549 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1541549 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1541549 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:34.291 00:06:34.291 real 0m6.567s 00:06:34.291 user 0m6.401s 00:06:34.291 sys 0m0.581s 00:06:34.291 07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.291 
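Aside, not captured output: the config.json dumped above is the save_config/--json round trip that skip_rpc_with_json checks — build some live state over RPC, serialize it, then restart the target from the dump with no RPC server at all and verify the state came back. Roughly (file names as used by the test):

  # First target: create a TCP NVMe-oF transport, then serialize the whole configuration.
  ./spdk/scripts/rpc.py nvmf_create_transport -t tcp
  ./spdk/scripts/rpc.py save_config > config.json
  # ... kill the first target here ...
  # Second target: replay the JSON with the RPC server disabled.
  ./spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json config.json > log.txt 2>&1 &
  sleep 5
  # The transport must have been re-created purely from the JSON dump.
  grep -q 'TCP Transport Init' log.txt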
07:42:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:34.291 ************************************ 00:06:34.291 END TEST skip_rpc_with_json 00:06:34.291 ************************************ 00:06:34.291 07:42:18 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:34.291 07:42:18 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:34.291 07:42:18 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.291 07:42:18 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.291 07:42:18 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:34.291 ************************************ 00:06:34.291 START TEST skip_rpc_with_delay 00:06:34.291 ************************************ 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:34.291 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:34.550 [2024-07-15 07:42:19.077515] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
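Aside, not from the log: skip_rpc_with_delay asserts exactly the startup error shown above — --wait-for-rpc pauses initialization until an RPC tells it to continue, so combining it with --no-rpc-server (which removes the only way to send that RPC) must be rejected. The check reduces to:

  # Expected to fail: no RPC server will ever deliver the "continue init" call.
  if ./spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo "unexpected: spdk_tgt accepted --wait-for-rpc together with --no-rpc-server" >&2
      exit 1
  fi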
00:06:34.550 [2024-07-15 07:42:19.077610] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:34.550 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:34.550 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:34.550 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:34.550 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:34.550 00:06:34.550 real 0m0.089s 00:06:34.550 user 0m0.058s 00:06:34.550 sys 0m0.030s 00:06:34.550 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.550 07:42:19 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:34.550 ************************************ 00:06:34.550 END TEST skip_rpc_with_delay 00:06:34.550 ************************************ 00:06:34.550 07:42:19 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:34.550 07:42:19 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:34.550 07:42:19 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:34.550 07:42:19 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:34.550 07:42:19 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.550 07:42:19 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.550 07:42:19 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:34.550 ************************************ 00:06:34.550 START TEST exit_on_failed_rpc_init 00:06:34.550 ************************************ 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1542513 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1542513 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 1542513 ']' 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.550 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.550 07:42:19 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:34.550 [2024-07-15 07:42:19.240176] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
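Aside, not from the log: the NOT/valid_exec_arg scaffolding that keeps appearing in these traces is the harness's way of asserting that a command fails. Stripped of its xtrace plumbing it behaves like the sketch below (a simplified stand-in, not the real autotest_common.sh implementation):

  # Succeed only if the wrapped command fails.
  NOT() {
      if "$@"; then
          return 1
      fi
      return 0
  }
  # e.g. the assertion made just above:
  NOT ./spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc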
00:06:34.550 [2024-07-15 07:42:19.240235] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1542513 ] 00:06:34.809 [2024-07-15 07:42:19.333276] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.809 [2024-07-15 07:42:19.409509] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:35.379 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:35.380 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:35.380 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:35.638 [2024-07-15 07:42:20.143850] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:06:35.638 [2024-07-15 07:42:20.143900] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1542811 ] 00:06:35.638 [2024-07-15 07:42:20.218591] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.638 [2024-07-15 07:42:20.288129] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.638 [2024-07-15 07:42:20.288199] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
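Aside, not captured output: the "socket in use" error above is the point of exit_on_failed_rpc_init — a second target bound to the same default /var/tmp/spdk.sock must fail RPC initialization and exit non-zero. In outline (waiting and cleanup trimmed):

  # First instance owns the default RPC socket.
  ./spdk/build/bin/spdk_tgt -m 0x1 &
  first_pid=$!
  # ... wait for /var/tmp/spdk.sock to answer ...
  # Second instance on another core mask but the same socket: must exit non-zero.
  if ./spdk/build/bin/spdk_tgt -m 0x2; then
      echo "unexpected: second target started although the socket is in use" >&2
      exit 1
  fi
  kill -9 "$first_pid"
  # Two targets can only coexist with distinct sockets, e.g. by giving one of them -r /var/tmp/other.sock.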
00:06:35.638 [2024-07-15 07:42:20.288212] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:35.638 [2024-07-15 07:42:20.288219] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1542513 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 1542513 ']' 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 1542513 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:35.638 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1542513 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1542513' 00:06:35.898 killing process with pid 1542513 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 1542513 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 1542513 00:06:35.898 00:06:35.898 real 0m1.435s 00:06:35.898 user 0m1.704s 00:06:35.898 sys 0m0.419s 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:35.898 07:42:20 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:35.898 ************************************ 00:06:35.898 END TEST exit_on_failed_rpc_init 00:06:35.898 ************************************ 00:06:35.898 07:42:20 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:35.898 07:42:20 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:35.898 00:06:35.898 real 0m13.783s 00:06:35.898 user 0m13.352s 00:06:35.898 sys 0m1.578s 00:06:35.898 07:42:20 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:35.898 07:42:20 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.898 ************************************ 00:06:35.898 END TEST skip_rpc 00:06:35.898 ************************************ 00:06:36.158 07:42:20 -- common/autotest_common.sh@1142 -- # return 0 00:06:36.158 07:42:20 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:36.158 07:42:20 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.158 07:42:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.158 07:42:20 -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 ************************************ 00:06:36.158 START TEST rpc_client 00:06:36.158 ************************************ 00:06:36.158 07:42:20 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:36.158 * Looking for test storage... 00:06:36.158 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:36.158 07:42:20 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:36.158 OK 00:06:36.158 07:42:20 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:36.158 00:06:36.158 real 0m0.132s 00:06:36.158 user 0m0.065s 00:06:36.158 sys 0m0.076s 00:06:36.158 07:42:20 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:36.158 07:42:20 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 ************************************ 00:06:36.158 END TEST rpc_client 00:06:36.158 ************************************ 00:06:36.158 07:42:20 -- common/autotest_common.sh@1142 -- # return 0 00:06:36.158 07:42:20 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:36.158 07:42:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.158 07:42:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.158 07:42:20 -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 ************************************ 00:06:36.418 START TEST json_config 00:06:36.418 ************************************ 00:06:36.418 07:42:20 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:36.418 07:42:21 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:36.418 07:42:21 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:36.418 07:42:21 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:36.418 07:42:21 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:36.418 07:42:21 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.418 07:42:21 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.418 07:42:21 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.418 07:42:21 json_config -- paths/export.sh@5 -- # export PATH 00:06:36.418 07:42:21 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@47 -- # : 0 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:36.418 07:42:21 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:36.418 
07:42:21 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:36.418 INFO: JSON configuration test init 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 07:42:21 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:36.418 07:42:21 json_config -- json_config/common.sh@9 -- # local app=target 00:06:36.418 07:42:21 json_config -- json_config/common.sh@10 -- # shift 00:06:36.418 07:42:21 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:36.418 07:42:21 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:36.418 07:42:21 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:36.418 07:42:21 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:36.418 07:42:21 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:36.418 07:42:21 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1542936 00:06:36.418 07:42:21 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:36.418 Waiting for target to run... 
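Aside, not from the log: json_config drives its target over a private socket and starts it with --wait-for-rpc so the crypto accel modules can be wired in before framework initialization completes. The commands traced next boil down to the following (the SOCK variable is illustrative; the socket path and RPC names are the ones visible in the trace):

  SOCK=/var/tmp/spdk_tgt.sock
  ./spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r "$SOCK" --wait-for-rpc &
  # ... wait for $SOCK to answer ...
  # Route encrypt/decrypt through the DPDK cryptodev accel module before init finishes.
  ./spdk/scripts/rpc.py -s "$SOCK" dpdk_cryptodev_scan_accel_module
  ./spdk/scripts/rpc.py -s "$SOCK" accel_assign_opc -o encrypt -m dpdk_cryptodev
  ./spdk/scripts/rpc.py -s "$SOCK" accel_assign_opc -o decrypt -m dpdk_cryptodev
  # Feed in the locally attached NVMe devices generated by gen_nvme.sh.
  ./spdk/scripts/gen_nvme.sh --json-with-subsystems | ./spdk/scripts/rpc.py -s "$SOCK" load_config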
00:06:36.418 07:42:21 json_config -- json_config/common.sh@25 -- # waitforlisten 1542936 /var/tmp/spdk_tgt.sock 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@829 -- # '[' -z 1542936 ']' 00:06:36.418 07:42:21 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:36.418 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:36.418 07:42:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 [2024-07-15 07:42:21.124111] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:06:36.418 [2024-07-15 07:42:21.124177] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1542936 ] 00:06:36.986 [2024-07-15 07:42:21.585448] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.986 [2024-07-15 07:42:21.646687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.555 07:42:22 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:37.555 07:42:22 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:37.555 07:42:22 json_config -- json_config/common.sh@26 -- # echo '' 00:06:37.555 00:06:37.555 07:42:22 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:37.555 07:42:22 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:37.555 07:42:22 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:37.555 07:42:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.555 07:42:22 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:37.555 07:42:22 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:37.555 07:42:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:37.814 07:42:22 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:37.814 07:42:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:38.074 [2024-07-15 07:42:22.673522] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:38.074 07:42:22 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:38.074 07:42:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:38.334 [2024-07-15 07:42:22.857973] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:38.334 07:42:22 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:38.334 07:42:22 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:38.334 07:42:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:38.334 07:42:22 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:38.334 07:42:22 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:38.334 07:42:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:38.594 [2024-07-15 07:42:23.110380] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:43.899 07:42:28 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:43.899 07:42:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:43.899 07:42:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:43.899 07:42:28 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:43.899 07:42:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:43.899 07:42:28 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:43.899 07:42:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:43.899 07:42:28 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:43.900 07:42:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:43.900 07:42:28 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:43.900 07:42:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:44.161 Nvme0n1p0 Nvme0n1p1 00:06:44.161 07:42:28 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:44.161 07:42:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:44.161 [2024-07-15 07:42:28.900517] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:44.161 [2024-07-15 07:42:28.900556] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:44.161 00:06:44.161 07:42:28 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:44.161 07:42:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:44.421 Malloc3 00:06:44.421 07:42:29 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:44.421 07:42:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:44.682 [2024-07-15 07:42:29.265485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:44.682 [2024-07-15 07:42:29.265519] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:44.682 [2024-07-15 07:42:29.265532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f80b20 00:06:44.682 [2024-07-15 07:42:29.265539] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:44.682 [2024-07-15 07:42:29.266757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:44.682 [2024-07-15 07:42:29.266777] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:44.682 PTBdevFromMalloc3 00:06:44.682 07:42:29 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:44.682 07:42:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:44.942 Null0 00:06:44.942 07:42:29 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:44.942 07:42:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:44.942 Malloc0 00:06:44.942 07:42:29 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:44.942 07:42:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:45.202 Malloc1 00:06:45.202 07:42:29 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:45.202 07:42:29 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:45.202 102400+0 records in 00:06:45.202 102400+0 records out 00:06:45.202 104857600 bytes (105 MB, 100 MiB) copied, 0.104183 s, 1.0 GB/s 00:06:45.202 07:42:29 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:45.202 07:42:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:45.462 aio_disk 00:06:45.462 07:42:30 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:45.462 07:42:30 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:45.462 07:42:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:49.667 2920b12f-dbe3-45a1-ac65-e3afaf26e8a6 00:06:49.667 07:42:34 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:49.667 07:42:34 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:49.667 07:42:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:49.927 07:42:34 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:49.927 07:42:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:49.927 07:42:34 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:49.927 07:42:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:50.187 07:42:34 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:50.187 07:42:34 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:50.447 07:42:35 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:50.447 07:42:35 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:50.447 07:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:50.447 MallocForCryptoBdev 00:06:50.447 07:42:35 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:50.447 07:42:35 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:50.707 07:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:50.707 [2024-07-15 07:42:35.400082] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:50.707 CryptoMallocBdev 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:3ea22e30-f065-4381-8523-ce0202f6b2ee bdev_register:038f7937-948e-4378-9920-21f2fb743390 bdev_register:546ea5c4-7084-4242-b36a-93705d1674db bdev_register:f70e0e0d-c7bd-4aab-8baa-dc6f713edba3 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:3ea22e30-f065-4381-8523-ce0202f6b2ee bdev_register:038f7937-948e-4378-9920-21f2fb743390 bdev_register:546ea5c4-7084-4242-b36a-93705d1674db bdev_register:f70e0e0d-c7bd-4aab-8baa-dc6f713edba3 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@71 -- # sort 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@72 -- # sort 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:50.707 07:42:35 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:50.707 07:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:3ea22e30-f065-4381-8523-ce0202f6b2ee 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:038f7937-948e-4378-9920-21f2fb743390 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:546ea5c4-7084-4242-b36a-93705d1674db 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:f70e0e0d-c7bd-4aab-8baa-dc6f713edba3 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:038f7937-948e-4378-9920-21f2fb743390 bdev_register:3ea22e30-f065-4381-8523-ce0202f6b2ee bdev_register:546ea5c4-7084-4242-b36a-93705d1674db bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:f70e0e0d-c7bd-4aab-8baa-dc6f713edba3 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\3\8\f\7\9\3\7\-\9\4\8\e\-\4\3\7\8\-\9\9\2\0\-\2\1\f\2\f\b\7\4\3\3\9\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\e\a\2\2\e\3\0\-\f\0\6\5\-\4\3\8\1\-\8\5\2\3\-\c\e\0\2\0\2\f\6\b\2\e\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\5\4\6\e\a\5\c\4\-\7\0\8\4\-\4\2\4\2\-\b\3\6\a\-\9\3\7\0\5\d\1\6\7\4\d\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\f\7\0\e\0\e\0\d\-\c\7\b\d\-\4\a\a\b\-\8\b\a\a\-\d\c\6\f\7\1\3\e\d\b\a\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@86 -- # cat 00:06:50.968 07:42:35 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:038f7937-948e-4378-9920-21f2fb743390 bdev_register:3ea22e30-f065-4381-8523-ce0202f6b2ee bdev_register:546ea5c4-7084-4242-b36a-93705d1674db bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:f70e0e0d-c7bd-4aab-8baa-dc6f713edba3 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:50.968 Expected events matched: 00:06:50.968 bdev_register:038f7937-948e-4378-9920-21f2fb743390 00:06:50.968 bdev_register:3ea22e30-f065-4381-8523-ce0202f6b2ee 00:06:50.968 bdev_register:546ea5c4-7084-4242-b36a-93705d1674db 00:06:50.968 bdev_register:aio_disk 00:06:50.968 bdev_register:CryptoMallocBdev 00:06:50.968 bdev_register:f70e0e0d-c7bd-4aab-8baa-dc6f713edba3 00:06:50.968 bdev_register:Malloc0 00:06:50.969 bdev_register:Malloc0p0 00:06:50.969 bdev_register:Malloc0p1 00:06:50.969 bdev_register:Malloc0p2 00:06:50.969 bdev_register:Malloc1 00:06:50.969 bdev_register:Malloc3 00:06:50.969 bdev_register:MallocForCryptoBdev 00:06:50.969 bdev_register:Null0 00:06:50.969 bdev_register:Nvme0n1 00:06:50.969 bdev_register:Nvme0n1p0 00:06:50.969 bdev_register:Nvme0n1p1 00:06:50.969 bdev_register:PTBdevFromMalloc3 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:50.969 07:42:35 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:50.969 07:42:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:50.969 07:42:35 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:50.969 07:42:35 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:50.969 07:42:35 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:51.229 07:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:51.229 MallocBdevForConfigChangeCheck 00:06:51.229 07:42:35 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:51.229 07:42:35 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:51.229 07:42:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.229 07:42:35 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:51.229 07:42:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:51.799 07:42:36 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:51.799 INFO: shutting down applications... 00:06:51.799 07:42:36 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:51.799 07:42:36 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:51.799 07:42:36 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:51.799 07:42:36 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:51.799 [2024-07-15 07:42:36.455156] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:54.343 Calling clear_iscsi_subsystem 00:06:54.343 Calling clear_nvmf_subsystem 00:06:54.343 Calling clear_nbd_subsystem 00:06:54.343 Calling clear_ublk_subsystem 00:06:54.343 Calling clear_vhost_blk_subsystem 00:06:54.343 Calling clear_vhost_scsi_subsystem 00:06:54.343 Calling clear_bdev_subsystem 00:06:54.343 07:42:39 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:54.343 07:42:39 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:54.343 07:42:39 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:54.343 07:42:39 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:54.343 07:42:39 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:54.343 07:42:39 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:54.603 07:42:39 json_config -- json_config/json_config.sh@345 -- # break 00:06:54.603 07:42:39 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:54.603 07:42:39 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:54.603 07:42:39 json_config -- json_config/common.sh@31 -- # local app=target 00:06:54.603 07:42:39 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:54.603 07:42:39 json_config -- json_config/common.sh@35 -- # [[ -n 
1542936 ]] 00:06:54.603 07:42:39 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1542936 00:06:54.603 07:42:39 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:54.603 07:42:39 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:54.603 07:42:39 json_config -- json_config/common.sh@41 -- # kill -0 1542936 00:06:54.603 07:42:39 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:55.174 07:42:39 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:55.174 07:42:39 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:55.174 07:42:39 json_config -- json_config/common.sh@41 -- # kill -0 1542936 00:06:55.174 07:42:39 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:55.174 07:42:39 json_config -- json_config/common.sh@43 -- # break 00:06:55.174 07:42:39 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:55.174 07:42:39 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:55.174 SPDK target shutdown done 00:06:55.174 07:42:39 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:55.174 INFO: relaunching applications... 00:06:55.174 07:42:39 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:55.174 07:42:39 json_config -- json_config/common.sh@9 -- # local app=target 00:06:55.174 07:42:39 json_config -- json_config/common.sh@10 -- # shift 00:06:55.174 07:42:39 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:55.174 07:42:39 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:55.174 07:42:39 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:55.174 07:42:39 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:55.174 07:42:39 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:55.174 07:42:39 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1546380 00:06:55.174 07:42:39 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:55.174 Waiting for target to run... 00:06:55.174 07:42:39 json_config -- json_config/common.sh@25 -- # waitforlisten 1546380 /var/tmp/spdk_tgt.sock 00:06:55.174 07:42:39 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:55.174 07:42:39 json_config -- common/autotest_common.sh@829 -- # '[' -z 1546380 ']' 00:06:55.174 07:42:39 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:55.174 07:42:39 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:55.174 07:42:39 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:55.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:55.174 07:42:39 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:55.174 07:42:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:55.174 [2024-07-15 07:42:39.923791] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
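For reference, the relaunch step above restarts spdk_tgt from the JSON configuration saved earlier in this run; a condensed sketch of that command exactly as recorded in the log (the -m/-s/-r values mirror the logged invocation, tgt_pid is just an illustrative local name, and the waitforlisten polling that the harness performs is omitted):
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 \
  -r /var/tmp/spdk_tgt.sock \
  --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json &
tgt_pid=$!   # this run records the relaunched target as pid 1546380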
00:06:55.174 [2024-07-15 07:42:39.923856] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546380 ] 00:06:55.744 [2024-07-15 07:42:40.272123] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.744 [2024-07-15 07:42:40.320374] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.744 [2024-07-15 07:42:40.374172] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:55.744 [2024-07-15 07:42:40.382204] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:55.744 [2024-07-15 07:42:40.390223] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:55.744 [2024-07-15 07:42:40.470398] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:58.285 [2024-07-15 07:42:42.604337] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:58.285 [2024-07-15 07:42:42.604387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:58.285 [2024-07-15 07:42:42.604395] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:58.285 [2024-07-15 07:42:42.612350] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:58.285 [2024-07-15 07:42:42.612368] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:58.285 [2024-07-15 07:42:42.620365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:58.285 [2024-07-15 07:42:42.620380] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:58.285 [2024-07-15 07:42:42.628398] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:58.285 [2024-07-15 07:42:42.628414] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:58.285 [2024-07-15 07:42:42.628421] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:00.824 [2024-07-15 07:42:45.486699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:00.824 [2024-07-15 07:42:45.486739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:00.824 [2024-07-15 07:42:45.486749] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda3b20 00:07:00.824 [2024-07-15 07:42:45.486756] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:00.824 [2024-07-15 07:42:45.486989] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:00.824 [2024-07-15 07:42:45.487001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:01.393 07:42:45 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:01.393 07:42:45 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:01.393 07:42:45 json_config -- json_config/common.sh@26 -- # echo '' 00:07:01.393 00:07:01.393 07:42:45 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:01.393 07:42:45 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:07:01.393 INFO: Checking if target configuration is the same... 00:07:01.393 07:42:45 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.393 07:42:45 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:01.393 07:42:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:01.393 + '[' 2 -ne 2 ']' 00:07:01.393 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:01.393 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:01.393 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:01.393 +++ basename /dev/fd/62 00:07:01.393 ++ mktemp /tmp/62.XXX 00:07:01.393 + tmp_file_1=/tmp/62.4Ul 00:07:01.393 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.393 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:01.393 + tmp_file_2=/tmp/spdk_tgt_config.json.9Ml 00:07:01.393 + ret=0 00:07:01.393 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.652 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:01.652 + diff -u /tmp/62.4Ul /tmp/spdk_tgt_config.json.9Ml 00:07:01.652 + echo 'INFO: JSON config files are the same' 00:07:01.652 INFO: JSON config files are the same 00:07:01.652 + rm /tmp/62.4Ul /tmp/spdk_tgt_config.json.9Ml 00:07:01.652 + exit 0 00:07:01.652 07:42:46 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:01.652 07:42:46 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:01.652 INFO: changing configuration and checking if this can be detected... 00:07:01.652 07:42:46 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:01.652 07:42:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:01.912 07:42:46 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.912 07:42:46 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:01.912 07:42:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:01.912 + '[' 2 -ne 2 ']' 00:07:01.912 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:01.912 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
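For reference, the json_diff.sh trace continuing here reduces to sorting both configurations and diffing them; a minimal sketch, assuming config_filter.py reads the configuration on stdin (its bare invocation in this trace suggests so) and using illustrative temp-file names instead of the mktemp results:
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py
$rpc -s /var/tmp/spdk_tgt.sock save_config | $filter -method sort > /tmp/live_sorted.json
$filter -method sort < /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json > /tmp/saved_sorted.json
diff -u /tmp/saved_sorted.json /tmp/live_sorted.json && echo 'INFO: JSON config files are the same'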
00:07:01.912 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:01.912 +++ basename /dev/fd/62 00:07:01.912 ++ mktemp /tmp/62.XXX 00:07:01.912 + tmp_file_1=/tmp/62.68n 00:07:01.912 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:01.912 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:01.912 + tmp_file_2=/tmp/spdk_tgt_config.json.zT1 00:07:01.912 + ret=0 00:07:01.912 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:02.171 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:02.171 + diff -u /tmp/62.68n /tmp/spdk_tgt_config.json.zT1 00:07:02.171 + ret=1 00:07:02.171 + echo '=== Start of file: /tmp/62.68n ===' 00:07:02.171 + cat /tmp/62.68n 00:07:02.171 + echo '=== End of file: /tmp/62.68n ===' 00:07:02.171 + echo '' 00:07:02.171 + echo '=== Start of file: /tmp/spdk_tgt_config.json.zT1 ===' 00:07:02.171 + cat /tmp/spdk_tgt_config.json.zT1 00:07:02.171 + echo '=== End of file: /tmp/spdk_tgt_config.json.zT1 ===' 00:07:02.171 + echo '' 00:07:02.171 + rm /tmp/62.68n /tmp/spdk_tgt_config.json.zT1 00:07:02.171 + exit 1 00:07:02.171 07:42:46 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:02.172 INFO: configuration change detected. 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:02.172 07:42:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:02.172 07:42:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@317 -- # [[ -n 1546380 ]] 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:02.172 07:42:46 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:02.172 07:42:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:02.172 07:42:46 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:02.172 07:42:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:02.431 07:42:47 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:02.431 07:42:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:02.690 07:42:47 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:02.690 07:42:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:07:02.949 07:42:47 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:02.949 07:42:47 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:02.949 07:42:47 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:02.949 07:42:47 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:02.949 07:42:47 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:02.949 07:42:47 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:02.950 07:42:47 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:02.950 07:42:47 json_config -- json_config/json_config.sh@323 -- # killprocess 1546380 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@948 -- # '[' -z 1546380 ']' 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@952 -- # kill -0 1546380 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@953 -- # uname 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:02.950 07:42:47 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1546380 00:07:03.213 07:42:47 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:03.213 07:42:47 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:03.213 07:42:47 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1546380' 00:07:03.213 killing process with pid 1546380 00:07:03.213 07:42:47 json_config -- common/autotest_common.sh@967 -- # kill 1546380 00:07:03.213 07:42:47 json_config -- common/autotest_common.sh@972 -- # wait 1546380 00:07:05.811 07:42:50 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:05.811 07:42:50 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:05.811 07:42:50 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:05.811 07:42:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.811 07:42:50 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:05.811 07:42:50 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:05.811 INFO: Success 00:07:05.811 00:07:05.811 real 0m29.373s 00:07:05.811 user 0m33.578s 00:07:05.811 sys 0m2.875s 00:07:05.811 07:42:50 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:05.811 07:42:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.811 ************************************ 00:07:05.811 END TEST json_config 00:07:05.811 ************************************ 00:07:05.811 07:42:50 -- common/autotest_common.sh@1142 -- # return 0 00:07:05.811 07:42:50 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:05.811 07:42:50 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:05.811 07:42:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.811 07:42:50 -- common/autotest_common.sh@10 -- # set +x 00:07:05.811 ************************************ 00:07:05.811 START TEST json_config_extra_key 00:07:05.811 ************************************ 00:07:05.811 07:42:50 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:05.811 07:42:50 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:05.811 07:42:50 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:05.811 07:42:50 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:05.811 07:42:50 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.811 07:42:50 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.811 07:42:50 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.811 07:42:50 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:05.811 07:42:50 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:05.811 07:42:50 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:05.811 INFO: launching applications... 
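The declarations echoed above come from json_config/common.sh; a condensed sketch of the same bookkeeping, with the values copied verbatim from the logged output (only the target app is tracked in this test):
declare -A app_pid=([target]='')
declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
declare -A app_params=([target]='-m 0x1 -s 1024')
declare -A configs_path=([target]='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json')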
00:07:05.811 07:42:50 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1548192 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:05.811 Waiting for target to run... 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1548192 /var/tmp/spdk_tgt.sock 00:07:05.811 07:42:50 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 1548192 ']' 00:07:05.811 07:42:50 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:05.811 07:42:50 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:05.811 07:42:50 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:05.811 07:42:50 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:05.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:05.811 07:42:50 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:05.811 07:42:50 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:05.811 [2024-07-15 07:42:50.551596] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:05.811 [2024-07-15 07:42:50.551664] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1548192 ] 00:07:06.382 [2024-07-15 07:42:50.908760] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.382 [2024-07-15 07:42:50.966605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.642 07:42:51 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:06.642 07:42:51 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:06.642 00:07:06.642 07:42:51 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:06.642 INFO: shutting down applications... 
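The shutdown traced next sends SIGINT to the target and then polls it with kill -0 until it exits; a condensed sketch of that loop from json_config/common.sh, with the pid and the 30 x 0.5 s budget taken from the log below:
app_pid=1548192
kill -SIGINT "$app_pid"
for ((i = 0; i < 30; i++)); do
  kill -0 "$app_pid" 2> /dev/null || break   # target gone: 'SPDK target shutdown done'
  sleep 0.5
done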
00:07:06.642 07:42:51 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1548192 ]] 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1548192 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1548192 00:07:06.642 07:42:51 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1548192 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:07.212 07:42:51 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:07.212 SPDK target shutdown done 00:07:07.212 07:42:51 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:07.212 Success 00:07:07.212 00:07:07.212 real 0m1.510s 00:07:07.212 user 0m1.041s 00:07:07.212 sys 0m0.448s 00:07:07.212 07:42:51 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:07.212 07:42:51 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:07.212 ************************************ 00:07:07.212 END TEST json_config_extra_key 00:07:07.212 ************************************ 00:07:07.212 07:42:51 -- common/autotest_common.sh@1142 -- # return 0 00:07:07.212 07:42:51 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:07.212 07:42:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:07.212 07:42:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.212 07:42:51 -- common/autotest_common.sh@10 -- # set +x 00:07:07.212 ************************************ 00:07:07.212 START TEST alias_rpc 00:07:07.212 ************************************ 00:07:07.212 07:42:51 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:07.474 * Looking for test storage... 
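Each TEST block in this log follows the same lifecycle: start spdk_tgt, wait for its RPC socket, exercise a few RPCs, then stop the process. A condensed sketch of that pattern for the alias_rpc test starting here (paths as recorded; the default /var/tmp/spdk.sock socket is assumed, matching the rpc_addr echoed below; 'sleep 1' is a crude stand-in for the harness's waitforlisten, and rpc_get_methods is used only as a safe example RPC, while alias_rpc.sh itself drives load_config as traced below):
spdk_tgt=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$spdk_tgt &                 # listens on /var/tmp/spdk.sock by default
spdk_tgt_pid=$!
sleep 1                     # stand-in for waitforlisten
$rpc rpc_get_methods > /dev/null
kill -SIGINT "$spdk_tgt_pid"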
00:07:07.474 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:07.474 07:42:52 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:07.474 07:42:52 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1548476 00:07:07.474 07:42:52 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1548476 00:07:07.474 07:42:52 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:07.474 07:42:52 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 1548476 ']' 00:07:07.474 07:42:52 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:07.474 07:42:52 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:07.474 07:42:52 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:07.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:07.474 07:42:52 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:07.474 07:42:52 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:07.474 [2024-07-15 07:42:52.131215] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:07.474 [2024-07-15 07:42:52.131281] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1548476 ] 00:07:07.474 [2024-07-15 07:42:52.223978] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.735 [2024-07-15 07:42:52.292112] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.305 07:42:52 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:08.305 07:42:52 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:08.305 07:42:52 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:08.565 07:42:53 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1548476 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 1548476 ']' 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 1548476 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1548476 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1548476' 00:07:08.565 killing process with pid 1548476 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@967 -- # kill 1548476 00:07:08.565 07:42:53 alias_rpc -- common/autotest_common.sh@972 -- # wait 1548476 00:07:08.826 00:07:08.826 real 0m1.449s 00:07:08.826 user 0m1.633s 00:07:08.826 sys 0m0.392s 00:07:08.826 07:42:53 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.826 07:42:53 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:08.826 ************************************ 00:07:08.826 END TEST alias_rpc 
00:07:08.826 ************************************ 00:07:08.826 07:42:53 -- common/autotest_common.sh@1142 -- # return 0 00:07:08.826 07:42:53 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:08.826 07:42:53 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:08.826 07:42:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:08.826 07:42:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.826 07:42:53 -- common/autotest_common.sh@10 -- # set +x 00:07:08.826 ************************************ 00:07:08.826 START TEST spdkcli_tcp 00:07:08.826 ************************************ 00:07:08.826 07:42:53 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:08.826 * Looking for test storage... 00:07:08.826 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1548813 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1548813 00:07:09.086 07:42:53 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 1548813 ']' 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:09.086 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:09.086 07:42:53 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:09.086 [2024-07-15 07:42:53.652258] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
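The spdkcli_tcp test starting here checks RPC access over TCP by bridging the target's UNIX socket to 127.0.0.1:9998 with socat; a condensed sketch using the socat and rpc.py invocations recorded just below (the -r/-t/-s/-p values simply mirror the logged command; the final kill is only cleanup of the bridge):
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
socat_pid=$!
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
  -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
kill "$socat_pid"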
00:07:09.086 [2024-07-15 07:42:53.652310] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1548813 ] 00:07:09.086 [2024-07-15 07:42:53.740151] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:09.086 [2024-07-15 07:42:53.805337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.086 [2024-07-15 07:42:53.805342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.030 07:42:54 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:10.030 07:42:54 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:10.030 07:42:54 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1549104 00:07:10.030 07:42:54 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:10.030 07:42:54 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:10.030 [ 00:07:10.030 "bdev_malloc_delete", 00:07:10.030 "bdev_malloc_create", 00:07:10.030 "bdev_null_resize", 00:07:10.030 "bdev_null_delete", 00:07:10.030 "bdev_null_create", 00:07:10.030 "bdev_nvme_cuse_unregister", 00:07:10.030 "bdev_nvme_cuse_register", 00:07:10.030 "bdev_opal_new_user", 00:07:10.030 "bdev_opal_set_lock_state", 00:07:10.030 "bdev_opal_delete", 00:07:10.030 "bdev_opal_get_info", 00:07:10.030 "bdev_opal_create", 00:07:10.030 "bdev_nvme_opal_revert", 00:07:10.030 "bdev_nvme_opal_init", 00:07:10.030 "bdev_nvme_send_cmd", 00:07:10.030 "bdev_nvme_get_path_iostat", 00:07:10.030 "bdev_nvme_get_mdns_discovery_info", 00:07:10.030 "bdev_nvme_stop_mdns_discovery", 00:07:10.030 "bdev_nvme_start_mdns_discovery", 00:07:10.030 "bdev_nvme_set_multipath_policy", 00:07:10.030 "bdev_nvme_set_preferred_path", 00:07:10.030 "bdev_nvme_get_io_paths", 00:07:10.030 "bdev_nvme_remove_error_injection", 00:07:10.030 "bdev_nvme_add_error_injection", 00:07:10.030 "bdev_nvme_get_discovery_info", 00:07:10.030 "bdev_nvme_stop_discovery", 00:07:10.030 "bdev_nvme_start_discovery", 00:07:10.030 "bdev_nvme_get_controller_health_info", 00:07:10.030 "bdev_nvme_disable_controller", 00:07:10.030 "bdev_nvme_enable_controller", 00:07:10.030 "bdev_nvme_reset_controller", 00:07:10.030 "bdev_nvme_get_transport_statistics", 00:07:10.030 "bdev_nvme_apply_firmware", 00:07:10.030 "bdev_nvme_detach_controller", 00:07:10.030 "bdev_nvme_get_controllers", 00:07:10.030 "bdev_nvme_attach_controller", 00:07:10.031 "bdev_nvme_set_hotplug", 00:07:10.031 "bdev_nvme_set_options", 00:07:10.031 "bdev_passthru_delete", 00:07:10.031 "bdev_passthru_create", 00:07:10.031 "bdev_lvol_set_parent_bdev", 00:07:10.031 "bdev_lvol_set_parent", 00:07:10.031 "bdev_lvol_check_shallow_copy", 00:07:10.031 "bdev_lvol_start_shallow_copy", 00:07:10.031 "bdev_lvol_grow_lvstore", 00:07:10.031 "bdev_lvol_get_lvols", 00:07:10.031 "bdev_lvol_get_lvstores", 00:07:10.031 "bdev_lvol_delete", 00:07:10.031 "bdev_lvol_set_read_only", 00:07:10.031 "bdev_lvol_resize", 00:07:10.031 "bdev_lvol_decouple_parent", 00:07:10.031 "bdev_lvol_inflate", 00:07:10.031 "bdev_lvol_rename", 00:07:10.031 "bdev_lvol_clone_bdev", 00:07:10.031 "bdev_lvol_clone", 00:07:10.031 "bdev_lvol_snapshot", 00:07:10.031 "bdev_lvol_create", 00:07:10.031 "bdev_lvol_delete_lvstore", 00:07:10.031 "bdev_lvol_rename_lvstore", 00:07:10.031 "bdev_lvol_create_lvstore", 
00:07:10.031 "bdev_raid_set_options", 00:07:10.031 "bdev_raid_remove_base_bdev", 00:07:10.031 "bdev_raid_add_base_bdev", 00:07:10.031 "bdev_raid_delete", 00:07:10.031 "bdev_raid_create", 00:07:10.031 "bdev_raid_get_bdevs", 00:07:10.031 "bdev_error_inject_error", 00:07:10.031 "bdev_error_delete", 00:07:10.031 "bdev_error_create", 00:07:10.031 "bdev_split_delete", 00:07:10.031 "bdev_split_create", 00:07:10.031 "bdev_delay_delete", 00:07:10.031 "bdev_delay_create", 00:07:10.031 "bdev_delay_update_latency", 00:07:10.031 "bdev_zone_block_delete", 00:07:10.031 "bdev_zone_block_create", 00:07:10.031 "blobfs_create", 00:07:10.031 "blobfs_detect", 00:07:10.031 "blobfs_set_cache_size", 00:07:10.031 "bdev_crypto_delete", 00:07:10.031 "bdev_crypto_create", 00:07:10.031 "bdev_compress_delete", 00:07:10.031 "bdev_compress_create", 00:07:10.031 "bdev_compress_get_orphans", 00:07:10.031 "bdev_aio_delete", 00:07:10.031 "bdev_aio_rescan", 00:07:10.031 "bdev_aio_create", 00:07:10.031 "bdev_ftl_set_property", 00:07:10.031 "bdev_ftl_get_properties", 00:07:10.031 "bdev_ftl_get_stats", 00:07:10.031 "bdev_ftl_unmap", 00:07:10.031 "bdev_ftl_unload", 00:07:10.031 "bdev_ftl_delete", 00:07:10.031 "bdev_ftl_load", 00:07:10.031 "bdev_ftl_create", 00:07:10.031 "bdev_virtio_attach_controller", 00:07:10.031 "bdev_virtio_scsi_get_devices", 00:07:10.031 "bdev_virtio_detach_controller", 00:07:10.031 "bdev_virtio_blk_set_hotplug", 00:07:10.031 "bdev_iscsi_delete", 00:07:10.031 "bdev_iscsi_create", 00:07:10.031 "bdev_iscsi_set_options", 00:07:10.031 "accel_error_inject_error", 00:07:10.031 "ioat_scan_accel_module", 00:07:10.031 "dsa_scan_accel_module", 00:07:10.031 "iaa_scan_accel_module", 00:07:10.031 "dpdk_cryptodev_get_driver", 00:07:10.031 "dpdk_cryptodev_set_driver", 00:07:10.031 "dpdk_cryptodev_scan_accel_module", 00:07:10.031 "compressdev_scan_accel_module", 00:07:10.031 "keyring_file_remove_key", 00:07:10.031 "keyring_file_add_key", 00:07:10.031 "keyring_linux_set_options", 00:07:10.031 "iscsi_get_histogram", 00:07:10.031 "iscsi_enable_histogram", 00:07:10.031 "iscsi_set_options", 00:07:10.031 "iscsi_get_auth_groups", 00:07:10.031 "iscsi_auth_group_remove_secret", 00:07:10.031 "iscsi_auth_group_add_secret", 00:07:10.031 "iscsi_delete_auth_group", 00:07:10.031 "iscsi_create_auth_group", 00:07:10.031 "iscsi_set_discovery_auth", 00:07:10.031 "iscsi_get_options", 00:07:10.031 "iscsi_target_node_request_logout", 00:07:10.031 "iscsi_target_node_set_redirect", 00:07:10.031 "iscsi_target_node_set_auth", 00:07:10.031 "iscsi_target_node_add_lun", 00:07:10.031 "iscsi_get_stats", 00:07:10.031 "iscsi_get_connections", 00:07:10.031 "iscsi_portal_group_set_auth", 00:07:10.031 "iscsi_start_portal_group", 00:07:10.031 "iscsi_delete_portal_group", 00:07:10.031 "iscsi_create_portal_group", 00:07:10.031 "iscsi_get_portal_groups", 00:07:10.031 "iscsi_delete_target_node", 00:07:10.031 "iscsi_target_node_remove_pg_ig_maps", 00:07:10.031 "iscsi_target_node_add_pg_ig_maps", 00:07:10.031 "iscsi_create_target_node", 00:07:10.031 "iscsi_get_target_nodes", 00:07:10.031 "iscsi_delete_initiator_group", 00:07:10.031 "iscsi_initiator_group_remove_initiators", 00:07:10.031 "iscsi_initiator_group_add_initiators", 00:07:10.031 "iscsi_create_initiator_group", 00:07:10.031 "iscsi_get_initiator_groups", 00:07:10.031 "nvmf_set_crdt", 00:07:10.031 "nvmf_set_config", 00:07:10.031 "nvmf_set_max_subsystems", 00:07:10.031 "nvmf_stop_mdns_prr", 00:07:10.031 "nvmf_publish_mdns_prr", 00:07:10.031 "nvmf_subsystem_get_listeners", 00:07:10.031 
"nvmf_subsystem_get_qpairs", 00:07:10.031 "nvmf_subsystem_get_controllers", 00:07:10.031 "nvmf_get_stats", 00:07:10.031 "nvmf_get_transports", 00:07:10.031 "nvmf_create_transport", 00:07:10.031 "nvmf_get_targets", 00:07:10.031 "nvmf_delete_target", 00:07:10.031 "nvmf_create_target", 00:07:10.031 "nvmf_subsystem_allow_any_host", 00:07:10.031 "nvmf_subsystem_remove_host", 00:07:10.031 "nvmf_subsystem_add_host", 00:07:10.031 "nvmf_ns_remove_host", 00:07:10.031 "nvmf_ns_add_host", 00:07:10.031 "nvmf_subsystem_remove_ns", 00:07:10.031 "nvmf_subsystem_add_ns", 00:07:10.031 "nvmf_subsystem_listener_set_ana_state", 00:07:10.031 "nvmf_discovery_get_referrals", 00:07:10.031 "nvmf_discovery_remove_referral", 00:07:10.031 "nvmf_discovery_add_referral", 00:07:10.031 "nvmf_subsystem_remove_listener", 00:07:10.031 "nvmf_subsystem_add_listener", 00:07:10.031 "nvmf_delete_subsystem", 00:07:10.031 "nvmf_create_subsystem", 00:07:10.031 "nvmf_get_subsystems", 00:07:10.031 "env_dpdk_get_mem_stats", 00:07:10.031 "nbd_get_disks", 00:07:10.031 "nbd_stop_disk", 00:07:10.031 "nbd_start_disk", 00:07:10.031 "ublk_recover_disk", 00:07:10.031 "ublk_get_disks", 00:07:10.031 "ublk_stop_disk", 00:07:10.031 "ublk_start_disk", 00:07:10.031 "ublk_destroy_target", 00:07:10.031 "ublk_create_target", 00:07:10.031 "virtio_blk_create_transport", 00:07:10.031 "virtio_blk_get_transports", 00:07:10.031 "vhost_controller_set_coalescing", 00:07:10.031 "vhost_get_controllers", 00:07:10.031 "vhost_delete_controller", 00:07:10.031 "vhost_create_blk_controller", 00:07:10.031 "vhost_scsi_controller_remove_target", 00:07:10.031 "vhost_scsi_controller_add_target", 00:07:10.031 "vhost_start_scsi_controller", 00:07:10.031 "vhost_create_scsi_controller", 00:07:10.031 "thread_set_cpumask", 00:07:10.031 "framework_get_governor", 00:07:10.031 "framework_get_scheduler", 00:07:10.031 "framework_set_scheduler", 00:07:10.031 "framework_get_reactors", 00:07:10.031 "thread_get_io_channels", 00:07:10.031 "thread_get_pollers", 00:07:10.031 "thread_get_stats", 00:07:10.031 "framework_monitor_context_switch", 00:07:10.031 "spdk_kill_instance", 00:07:10.031 "log_enable_timestamps", 00:07:10.031 "log_get_flags", 00:07:10.031 "log_clear_flag", 00:07:10.031 "log_set_flag", 00:07:10.031 "log_get_level", 00:07:10.031 "log_set_level", 00:07:10.031 "log_get_print_level", 00:07:10.031 "log_set_print_level", 00:07:10.031 "framework_enable_cpumask_locks", 00:07:10.031 "framework_disable_cpumask_locks", 00:07:10.031 "framework_wait_init", 00:07:10.031 "framework_start_init", 00:07:10.031 "scsi_get_devices", 00:07:10.031 "bdev_get_histogram", 00:07:10.031 "bdev_enable_histogram", 00:07:10.031 "bdev_set_qos_limit", 00:07:10.031 "bdev_set_qd_sampling_period", 00:07:10.031 "bdev_get_bdevs", 00:07:10.031 "bdev_reset_iostat", 00:07:10.031 "bdev_get_iostat", 00:07:10.031 "bdev_examine", 00:07:10.031 "bdev_wait_for_examine", 00:07:10.031 "bdev_set_options", 00:07:10.031 "notify_get_notifications", 00:07:10.031 "notify_get_types", 00:07:10.031 "accel_get_stats", 00:07:10.031 "accel_set_options", 00:07:10.031 "accel_set_driver", 00:07:10.031 "accel_crypto_key_destroy", 00:07:10.031 "accel_crypto_keys_get", 00:07:10.031 "accel_crypto_key_create", 00:07:10.031 "accel_assign_opc", 00:07:10.031 "accel_get_module_info", 00:07:10.031 "accel_get_opc_assignments", 00:07:10.031 "vmd_rescan", 00:07:10.031 "vmd_remove_device", 00:07:10.031 "vmd_enable", 00:07:10.031 "sock_get_default_impl", 00:07:10.031 "sock_set_default_impl", 00:07:10.031 "sock_impl_set_options", 00:07:10.031 
"sock_impl_get_options", 00:07:10.031 "iobuf_get_stats", 00:07:10.031 "iobuf_set_options", 00:07:10.031 "framework_get_pci_devices", 00:07:10.031 "framework_get_config", 00:07:10.031 "framework_get_subsystems", 00:07:10.031 "trace_get_info", 00:07:10.031 "trace_get_tpoint_group_mask", 00:07:10.031 "trace_disable_tpoint_group", 00:07:10.031 "trace_enable_tpoint_group", 00:07:10.031 "trace_clear_tpoint_mask", 00:07:10.031 "trace_set_tpoint_mask", 00:07:10.031 "keyring_get_keys", 00:07:10.031 "spdk_get_version", 00:07:10.031 "rpc_get_methods" 00:07:10.031 ] 00:07:10.031 07:42:54 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.031 07:42:54 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:10.031 07:42:54 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1548813 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 1548813 ']' 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 1548813 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1548813 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1548813' 00:07:10.031 killing process with pid 1548813 00:07:10.031 07:42:54 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 1548813 00:07:10.032 07:42:54 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 1548813 00:07:10.292 00:07:10.292 real 0m1.496s 00:07:10.292 user 0m2.847s 00:07:10.292 sys 0m0.424s 00:07:10.292 07:42:54 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.292 07:42:54 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:10.292 ************************************ 00:07:10.292 END TEST spdkcli_tcp 00:07:10.292 ************************************ 00:07:10.292 07:42:55 -- common/autotest_common.sh@1142 -- # return 0 00:07:10.292 07:42:55 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:10.292 07:42:55 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:10.292 07:42:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.292 07:42:55 -- common/autotest_common.sh@10 -- # set +x 00:07:10.552 ************************************ 00:07:10.552 START TEST dpdk_mem_utility 00:07:10.552 ************************************ 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:10.552 * Looking for test storage... 
00:07:10.552 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:10.552 07:42:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:10.552 07:42:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1549190 00:07:10.552 07:42:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1549190 00:07:10.552 07:42:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 1549190 ']' 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:10.552 07:42:55 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:10.552 [2024-07-15 07:42:55.222954] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:10.552 [2024-07-15 07:42:55.223007] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1549190 ] 00:07:10.812 [2024-07-15 07:42:55.311167] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.812 [2024-07-15 07:42:55.376795] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.383 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:11.383 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:07:11.383 07:42:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:11.383 07:42:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:11.383 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:11.383 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:11.383 { 00:07:11.383 "filename": "/tmp/spdk_mem_dump.txt" 00:07:11.383 } 00:07:11.383 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:11.383 07:42:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:11.383 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:11.383 2 heaps totaling size 816.000000 MiB 00:07:11.383 size: 814.000000 MiB heap id: 0 00:07:11.383 size: 2.000000 MiB heap id: 1 00:07:11.383 end heaps---------- 00:07:11.383 8 mempools totaling size 598.116089 MiB 00:07:11.383 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:11.383 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:11.383 size: 84.521057 MiB name: bdev_io_1549190 00:07:11.383 size: 51.011292 MiB name: evtpool_1549190 00:07:11.383 size: 50.003479 MiB name: 
msgpool_1549190 00:07:11.383 size: 21.763794 MiB name: PDU_Pool 00:07:11.383 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:11.383 size: 0.026123 MiB name: Session_Pool 00:07:11.383 end mempools------- 00:07:11.383 201 memzones totaling size 4.176453 MiB 00:07:11.383 size: 1.000366 MiB name: RG_ring_0_1549190 00:07:11.383 size: 1.000366 MiB name: RG_ring_1_1549190 00:07:11.383 size: 1.000366 MiB name: RG_ring_4_1549190 00:07:11.383 size: 1.000366 MiB name: RG_ring_5_1549190 00:07:11.383 size: 0.125366 MiB name: RG_ring_2_1549190 00:07:11.383 size: 0.015991 MiB name: RG_ring_3_1549190 00:07:11.383 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.0_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.1_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.2_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.3_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.4_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.5_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.6_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:01.7_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.0_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.1_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.2_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.3_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.4_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.5_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.6_qat 00:07:11.383 size: 0.000305 MiB name: 0000:cc:02.7_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.0_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.1_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.2_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.3_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.4_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.5_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.6_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:01.7_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.0_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.1_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.2_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.3_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.4_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.5_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.6_qat 00:07:11.383 size: 0.000305 MiB name: 0000:ce:02.7_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.0_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.1_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.2_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.3_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.4_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.5_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.6_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:01.7_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:02.0_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:02.1_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:02.2_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:02.3_qat 00:07:11.383 size: 0.000305 MiB name: 0000:d0:02.4_qat 00:07:11.384 size: 0.000305 MiB name: 0000:d0:02.5_qat 00:07:11.384 size: 0.000305 MiB name: 0000:d0:02.6_qat 00:07:11.384 size: 0.000305 MiB name: 0000:d0:02.7_qat 00:07:11.384 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:11.384 size: 0.000122 
MiB name: rte_cryptodev_data_2 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:11.384 size: 
0.000122 MiB name: rte_compressdev_data_20 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_72 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:11.384 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_94 00:07:11.384 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:11.384 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:11.384 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:11.384 end memzones------- 00:07:11.648 07:42:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:11.648 heap id: 0 total size: 814.000000 MiB number of busy elements: 495 number of free elements: 14 00:07:11.648 list of free elements. size: 11.842529 MiB 00:07:11.648 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:11.648 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:11.648 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:11.648 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:11.648 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:11.648 element at address: 0x200007000000 with size: 0.991760 MiB 00:07:11.648 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:11.648 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:11.648 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:07:11.648 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:11.648 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:11.648 element at address: 0x200000800000 with size: 0.486145 MiB 00:07:11.648 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:11.648 element at address: 0x200027e00000 with size: 0.399414 MiB 00:07:11.648 list of standard malloc elements. 
size: 199.872620 MiB 00:07:11.648 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:11.648 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:11.648 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:11.648 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:11.648 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:11.648 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:11.648 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:11.648 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:11.648 element at address: 0x20000033b340 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000033e8c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000341e40 with size: 0.004395 MiB 00:07:11.648 element at address: 0x2000003453c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000348940 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000034bec0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000034f440 with size: 0.004395 MiB 00:07:11.648 element at address: 0x2000003529c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000355f40 with size: 0.004395 MiB 00:07:11.648 element at address: 0x2000003594c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000035ca40 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000035ffc0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000363540 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000366ac0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000036a040 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000036d5c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000370b40 with size: 0.004395 MiB 00:07:11.648 element at address: 0x2000003740c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000377640 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000037abc0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x20000037e140 with size: 0.004395 MiB 00:07:11.648 element at address: 0x2000003816c0 with size: 0.004395 MiB 00:07:11.648 element at address: 0x200000384c40 with size: 0.004395 MiB 00:07:11.648 element at address: 0x2000003881c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x20000038b740 with size: 0.004395 MiB 00:07:11.649 element at address: 0x20000038ecc0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x200000392240 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003957c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x200000398d40 with size: 0.004395 MiB 00:07:11.649 element at address: 0x20000039c2c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x20000039f840 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003a2dc0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003a6340 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003a98c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003ace40 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003b03c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003b3940 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003b6ec0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003ba440 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003bd9c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003c0f40 with size: 0.004395 MiB 
00:07:11.649 element at address: 0x2000003c44c0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003c7a40 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003cafc0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003ce540 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003d1ac0 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003d5040 with size: 0.004395 MiB 00:07:11.649 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:11.649 element at address: 0x200000339240 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000033a2c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000033c7c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000033d840 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000033fd40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000340dc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003432c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000344340 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000346840 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003478c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000349dc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000034ae40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000034d340 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000034e3c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003508c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000351940 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000353e40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000354ec0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003573c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000358440 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000035a940 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000035b9c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000035dec0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000035ef40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000361440 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003624c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003649c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000365a40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000367f40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000368fc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000036b4c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000036c540 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000036ea40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000036fac0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000371fc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000373040 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000375540 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003765c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000378ac0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000379b40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000037c040 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000037d0c0 with size: 0.004028 MiB 00:07:11.649 element at 
address: 0x20000037f5c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000380640 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000382b40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000383bc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003860c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000387140 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000389640 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000038a6c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000038cbc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000038dc40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000390140 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003911c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003936c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000394740 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000396c40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000397cc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000039a1c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000039b240 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000039d740 with size: 0.004028 MiB 00:07:11.649 element at address: 0x20000039e7c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003a0cc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003a1d40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003a52c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003a77c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003a8840 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003aad40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003abdc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003ae2c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003af340 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003b1840 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003b28c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003b4dc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003b5e40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003b8340 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003b93c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003bb8c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003bc940 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003bee40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003bfec0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003c23c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003c3440 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003c5940 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003c69c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003c8ec0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003c9f40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003cc440 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003cd4c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003cf9c0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003d0a40 
with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003d2f40 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003d3fc0 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:11.649 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:11.649 element at address: 0x200000200000 with size: 0.000305 MiB 00:07:11.649 element at address: 0x20000020ea00 with size: 0.000305 MiB 00:07:11.649 element at address: 0x200000200140 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200200 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200380 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200440 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200500 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200680 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200740 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200800 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200980 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200a40 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200b00 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200c80 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200d40 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000200e00 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002090c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209180 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209240 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209300 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002093c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209480 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209540 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209600 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002096c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209780 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209840 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209900 with size: 0.000183 MiB 00:07:11.649 element at address: 0x2000002099c0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209a80 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209b40 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209c00 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209cc0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209d80 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209e40 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209f00 with size: 0.000183 MiB 00:07:11.649 element at address: 0x200000209fc0 with size: 0.000183 MiB 00:07:11.649 element at address: 0x20000020a080 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a140 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a200 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a2c0 with size: 0.000183 MiB 
00:07:11.650 element at address: 0x20000020a380 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a440 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a500 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a5c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a680 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a740 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a800 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a8c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020a980 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020aa40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ab00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020abc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ac80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ad40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ae00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020aec0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020af80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b040 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b100 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b1c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b280 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b340 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b400 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b4c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b580 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b640 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b700 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b7c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b880 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020b940 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ba00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020bac0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020bb80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020bc40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020bd00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020bdc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020be80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020bf40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c000 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c0c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c180 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c240 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c300 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c3c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c480 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c540 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c600 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c6c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c780 with size: 0.000183 MiB 00:07:11.650 element at 
address: 0x20000020c840 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c900 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020c9c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ca80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020cb40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020cc00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ccc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020cd80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ce40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020cf00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020cfc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d080 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d140 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d200 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d2c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d380 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d440 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d500 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d5c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d680 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d740 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d800 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d8c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020d980 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020da40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020db00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020dbc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020dc80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020dd40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020de00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020dec0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020df80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e040 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e100 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e1c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e280 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e340 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e400 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e4c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e580 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e640 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e700 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e7c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e880 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020e940 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020eb40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ec00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ecc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ed80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ee40 
with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ef00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020efc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f080 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f140 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f200 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f2c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f380 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f440 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f500 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f5c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f680 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f740 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f800 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f8c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020f980 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fa40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fb00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fbc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fc80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fd40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fe00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020fec0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x20000020ff80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210040 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210100 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002101c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210280 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210340 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210400 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002104c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210580 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210640 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210700 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002107c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210880 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210940 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210a00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210ac0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000210cc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000214f80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235240 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235300 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002353c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235480 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235540 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235600 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002356c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235780 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235840 with size: 0.000183 MiB 
00:07:11.650 element at address: 0x200000235900 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002359c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235a80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235b40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235c00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235cc0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235d80 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235e40 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000235f00 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000236100 with size: 0.000183 MiB 00:07:11.650 element at address: 0x2000002361c0 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000236280 with size: 0.000183 MiB 00:07:11.650 element at address: 0x200000236340 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236400 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000002364c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236580 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236640 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236700 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000002367c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236880 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236940 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236a00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236ac0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236b80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236c40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000236d00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000338f00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000338fc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000033c540 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000033fac0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000343040 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003465c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000349b40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000034d0c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000350640 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000353bc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000357140 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000035a6c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000035dc40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003611c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000364740 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000367cc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000036b240 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000036e7c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000371d40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003752c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000378840 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000037bdc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000037f340 with size: 0.000183 MiB 00:07:11.651 element at 
address: 0x2000003828c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000385e40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003893c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000038c940 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000038fec0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000393440 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003969c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200000399f40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000039d4c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003a0a40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003a3fc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003a7540 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003aaac0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003ae040 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003b4b40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003b80c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003bb640 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003bebc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003c2140 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003c56c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003c8c40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003cc1c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003cf740 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003d2cc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000003d6840 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087c740 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087c800 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087c980 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e66400 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e664c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d0c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d8c0 
with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6fd80 with size: 0.000183 MiB 
00:07:11.651 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:11.651 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:11.651 list of memzone associated elements. size: 602.284851 MiB 00:07:11.651 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:11.651 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:11.651 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:11.651 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:11.651 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:11.651 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1549190_0 00:07:11.651 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:11.651 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1549190_0 00:07:11.651 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:11.651 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1549190_0 00:07:11.651 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:11.651 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:11.651 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:11.651 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:11.651 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:11.651 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1549190 00:07:11.651 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:11.651 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1549190 00:07:11.651 element at address: 0x200000236dc0 with size: 1.008118 MiB 00:07:11.651 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1549190 00:07:11.651 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:11.651 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:11.651 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:11.651 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:11.651 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:11.652 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:11.652 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:11.652 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:11.652 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:11.652 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1549190 00:07:11.652 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:11.652 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1549190 00:07:11.652 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:11.652 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1549190 00:07:11.652 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:11.652 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1549190 00:07:11.652 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:11.652 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1549190 00:07:11.652 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:11.652 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:11.652 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:11.652 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:11.652 element at address: 0x20001947c600 with size: 
0.250488 MiB 00:07:11.652 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:11.652 element at address: 0x200000215040 with size: 0.125488 MiB 00:07:11.652 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1549190 00:07:11.652 element at address: 0x200000200ec0 with size: 0.031738 MiB 00:07:11.652 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:11.652 element at address: 0x200027e66580 with size: 0.023743 MiB 00:07:11.652 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:11.652 element at address: 0x200000210d80 with size: 0.016113 MiB 00:07:11.652 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1549190 00:07:11.652 element at address: 0x200027e6c6c0 with size: 0.002441 MiB 00:07:11.652 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:11.652 element at address: 0x2000003d6300 with size: 0.001282 MiB 00:07:11.652 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:11.652 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.0_qat 00:07:11.652 element at address: 0x2000003d2d80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.1_qat 00:07:11.652 element at address: 0x2000003cf800 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.2_qat 00:07:11.652 element at address: 0x2000003cc280 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.3_qat 00:07:11.652 element at address: 0x2000003c8d00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.4_qat 00:07:11.652 element at address: 0x2000003c5780 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.5_qat 00:07:11.652 element at address: 0x2000003c2200 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.6_qat 00:07:11.652 element at address: 0x2000003bec80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.7_qat 00:07:11.652 element at address: 0x2000003bb700 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.0_qat 00:07:11.652 element at address: 0x2000003b8180 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.1_qat 00:07:11.652 element at address: 0x2000003b4c00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.2_qat 00:07:11.652 element at address: 0x2000003b1680 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.3_qat 00:07:11.652 element at address: 0x2000003ae100 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.4_qat 00:07:11.652 element at address: 0x2000003aab80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.5_qat 00:07:11.652 element at address: 0x2000003a7600 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.6_qat 00:07:11.652 element at address: 0x2000003a4080 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.7_qat 00:07:11.652 element at address: 0x2000003a0b00 with size: 0.000427 MiB 00:07:11.652 
associated memzone info: size: 0.000305 MiB name: 0000:ce:01.0_qat 00:07:11.652 element at address: 0x20000039d580 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.1_qat 00:07:11.652 element at address: 0x20000039a000 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.2_qat 00:07:11.652 element at address: 0x200000396a80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.3_qat 00:07:11.652 element at address: 0x200000393500 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.4_qat 00:07:11.652 element at address: 0x20000038ff80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.5_qat 00:07:11.652 element at address: 0x20000038ca00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.6_qat 00:07:11.652 element at address: 0x200000389480 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.7_qat 00:07:11.652 element at address: 0x200000385f00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.0_qat 00:07:11.652 element at address: 0x200000382980 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.1_qat 00:07:11.652 element at address: 0x20000037f400 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.2_qat 00:07:11.652 element at address: 0x20000037be80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.3_qat 00:07:11.652 element at address: 0x200000378900 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.4_qat 00:07:11.652 element at address: 0x200000375380 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.5_qat 00:07:11.652 element at address: 0x200000371e00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.6_qat 00:07:11.652 element at address: 0x20000036e880 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.7_qat 00:07:11.652 element at address: 0x20000036b300 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.0_qat 00:07:11.652 element at address: 0x200000367d80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.1_qat 00:07:11.652 element at address: 0x200000364800 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.2_qat 00:07:11.652 element at address: 0x200000361280 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.3_qat 00:07:11.652 element at address: 0x20000035dd00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.4_qat 00:07:11.652 element at address: 0x20000035a780 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.5_qat 00:07:11.652 element at address: 0x200000357200 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.6_qat 00:07:11.652 element at address: 0x200000353c80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 
0000:d0:01.7_qat 00:07:11.652 element at address: 0x200000350700 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.0_qat 00:07:11.652 element at address: 0x20000034d180 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.1_qat 00:07:11.652 element at address: 0x200000349c00 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.2_qat 00:07:11.652 element at address: 0x200000346680 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.3_qat 00:07:11.652 element at address: 0x200000343100 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.4_qat 00:07:11.652 element at address: 0x20000033fb80 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.5_qat 00:07:11.652 element at address: 0x20000033c600 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.6_qat 00:07:11.652 element at address: 0x200000339080 with size: 0.000427 MiB 00:07:11.652 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.7_qat 00:07:11.652 element at address: 0x2000003d6900 with size: 0.000305 MiB 00:07:11.652 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:11.652 element at address: 0x200000235fc0 with size: 0.000305 MiB 00:07:11.652 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1549190 00:07:11.652 element at address: 0x200000210b80 with size: 0.000305 MiB 00:07:11.652 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1549190 00:07:11.652 element at address: 0x200027e6d180 with size: 0.000305 MiB 00:07:11.652 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:11.652 element at address: 0x2000003d6240 with size: 0.000183 MiB 00:07:11.652 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:11.652 07:42:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:11.652 07:42:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1549190 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 1549190 ']' 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 1549190 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1549190 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1549190' 00:07:11.652 killing process with pid 1549190 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 1549190 00:07:11.652 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 1549190 00:07:11.913 00:07:11.913 real 0m1.433s 00:07:11.913 user 0m1.614s 00:07:11.913 sys 0m0.408s 00:07:11.913 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:11.913 07:42:56 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:11.913 
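The dpdk_mem_utility output above dumps every allocated element plus the memzones associated with them. A quick way to aggregate such a dump offline is a small shell filter; this is only a sketch, and mem_dump.log is a stand-in for wherever the console output was saved (one log entry per line), not a file the test itself produces.

    # Aggregate the per-element dump above: count elements and total their size.
    # mem_dump.log is a hypothetical copy of the console output, one entry per line.
    grep -o 'element at address: 0x[0-9a-f]* with size: [0-9.]* MiB' mem_dump.log \
      | awk '{ n++; sum += $(NF-1) } END { printf "%d elements, %.3f MiB total\n", n, sum }'

Adjusting the pattern to match the "associated memzone info" lines gives the same kind of summary for the named memzones.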
************************************ 00:07:11.913 END TEST dpdk_mem_utility 00:07:11.913 ************************************ 00:07:11.913 07:42:56 -- common/autotest_common.sh@1142 -- # return 0 00:07:11.913 07:42:56 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:11.913 07:42:56 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:11.913 07:42:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.913 07:42:56 -- common/autotest_common.sh@10 -- # set +x 00:07:11.913 ************************************ 00:07:11.913 START TEST event 00:07:11.913 ************************************ 00:07:11.913 07:42:56 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:11.913 * Looking for test storage... 00:07:11.913 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:11.913 07:42:56 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:11.913 07:42:56 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:11.913 07:42:56 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:11.913 07:42:56 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:11.913 07:42:56 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.913 07:42:56 event -- common/autotest_common.sh@10 -- # set +x 00:07:12.174 ************************************ 00:07:12.174 START TEST event_perf 00:07:12.174 ************************************ 00:07:12.174 07:42:56 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:12.174 Running I/O for 1 seconds...[2024-07-15 07:42:56.721874] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:12.174 [2024-07-15 07:42:56.721991] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1549544 ] 00:07:12.174 [2024-07-15 07:42:56.861847] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:12.435 [2024-07-15 07:42:56.940425] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.435 [2024-07-15 07:42:56.940577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:12.435 [2024-07-15 07:42:56.940743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.435 [2024-07-15 07:42:56.940743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:13.375 Running I/O for 1 seconds... 00:07:13.375 lcore 0: 79980 00:07:13.375 lcore 1: 79983 00:07:13.375 lcore 2: 79987 00:07:13.375 lcore 3: 79984 00:07:13.375 done. 
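event_perf above reports one events-processed counter per lcore for the one-second run. Summing them gives the aggregate event rate; a minimal sketch, assuming the console output was saved to event_perf.log (a hypothetical name):

    # Total the per-lcore counters printed by event_perf (lines like "lcore 0: 79980").
    grep -E 'lcore [0-9]+: [0-9]+' event_perf.log \
      | awk '{ total += $NF } END { print "events in 1 s across all lcores:", total }'

With the counts shown here that comes to 319934 events, roughly 320k across the four cores.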
00:07:13.375 00:07:13.375 real 0m1.301s 00:07:13.375 user 0m4.148s 00:07:13.375 sys 0m0.144s 00:07:13.375 07:42:57 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:13.375 07:42:57 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:13.375 ************************************ 00:07:13.375 END TEST event_perf 00:07:13.375 ************************************ 00:07:13.375 07:42:58 event -- common/autotest_common.sh@1142 -- # return 0 00:07:13.375 07:42:58 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:13.375 07:42:58 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:13.375 07:42:58 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.375 07:42:58 event -- common/autotest_common.sh@10 -- # set +x 00:07:13.375 ************************************ 00:07:13.375 START TEST event_reactor 00:07:13.375 ************************************ 00:07:13.375 07:42:58 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:13.375 [2024-07-15 07:42:58.091594] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:13.376 [2024-07-15 07:42:58.091663] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1549855 ] 00:07:13.636 [2024-07-15 07:42:58.180857] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.636 [2024-07-15 07:42:58.244564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.577 test_start 00:07:14.577 oneshot 00:07:14.577 tick 100 00:07:14.577 tick 100 00:07:14.577 tick 250 00:07:14.577 tick 100 00:07:14.577 tick 100 00:07:14.577 tick 250 00:07:14.577 tick 100 00:07:14.577 tick 500 00:07:14.577 tick 100 00:07:14.577 tick 100 00:07:14.577 tick 250 00:07:14.577 tick 100 00:07:14.577 tick 100 00:07:14.577 test_end 00:07:14.577 00:07:14.577 real 0m1.229s 00:07:14.577 user 0m1.143s 00:07:14.577 sys 0m0.082s 00:07:14.577 07:42:59 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.577 07:42:59 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:14.577 ************************************ 00:07:14.577 END TEST event_reactor 00:07:14.577 ************************************ 00:07:14.577 07:42:59 event -- common/autotest_common.sh@1142 -- # return 0 00:07:14.577 07:42:59 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:14.577 07:42:59 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:14.577 07:42:59 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.577 07:42:59 event -- common/autotest_common.sh@10 -- # set +x 00:07:14.838 ************************************ 00:07:14.838 START TEST event_reactor_perf 00:07:14.838 ************************************ 00:07:14.838 07:42:59 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:14.838 [2024-07-15 07:42:59.392268] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:07:14.838 [2024-07-15 07:42:59.392351] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1549925 ] 00:07:14.838 [2024-07-15 07:42:59.482953] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.838 [2024-07-15 07:42:59.559282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.223 test_start 00:07:16.223 test_end 00:07:16.223 Performance: 400313 events per second 00:07:16.223 00:07:16.223 real 0m1.244s 00:07:16.223 user 0m1.147s 00:07:16.223 sys 0m0.091s 00:07:16.223 07:43:00 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.223 07:43:00 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:16.223 ************************************ 00:07:16.223 END TEST event_reactor_perf 00:07:16.223 ************************************ 00:07:16.223 07:43:00 event -- common/autotest_common.sh@1142 -- # return 0 00:07:16.223 07:43:00 event -- event/event.sh@49 -- # uname -s 00:07:16.223 07:43:00 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:16.223 07:43:00 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:16.223 07:43:00 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:16.223 07:43:00 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.223 07:43:00 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.223 ************************************ 00:07:16.223 START TEST event_scheduler 00:07:16.223 ************************************ 00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:16.223 * Looking for test storage... 00:07:16.223 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:16.223 07:43:00 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:16.223 07:43:00 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1550254 00:07:16.223 07:43:00 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:16.223 07:43:00 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:16.223 07:43:00 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1550254 00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 1550254 ']' 00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.223 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:16.223 07:43:00 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:16.223 [2024-07-15 07:43:00.842016] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:16.223 [2024-07-15 07:43:00.842069] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550254 ] 00:07:16.483 [2024-07-15 07:43:00.989861] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:16.484 [2024-07-15 07:43:01.158370] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.484 [2024-07-15 07:43:01.158652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.484 [2024-07-15 07:43:01.158807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:16.484 [2024-07-15 07:43:01.158826] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:07:17.057 07:43:01 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:17.057 [2024-07-15 07:43:01.701861] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:17.057 [2024-07-15 07:43:01.701908] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:17.057 [2024-07-15 07:43:01.701946] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:17.057 [2024-07-15 07:43:01.701973] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:17.057 [2024-07-15 07:43:01.702002] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.057 07:43:01 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:17.057 [2024-07-15 07:43:01.807531] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
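The scheduler_create_thread subtest that follows drives the scheduler test app purely over RPC. Stripped of the rpc_cmd wrapper, the calls visible in the trace below look roughly like this; the -s socket argument is omitted and the numeric thread ids are placeholders for whatever the app returns at runtime:

    RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # create an actively spinning thread pinned to core 0 (-n name, -m cpumask, -a active load 0-100)
    $RPC --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    # change a created thread's active load to 50 (11 is a placeholder thread id)
    $RPC --plugin scheduler_plugin scheduler_thread_set_active 11 50
    # delete a thread by id (12 is a placeholder thread id)
    $RPC --plugin scheduler_plugin scheduler_thread_delete 12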
00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.057 07:43:01 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.057 07:43:01 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 ************************************ 00:07:17.318 START TEST scheduler_create_thread 00:07:17.318 ************************************ 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 2 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 3 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 4 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 5 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 6 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 7 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 8 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.318 9 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.318 07:43:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:17.891 10 00:07:17.891 07:43:02 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.891 07:43:02 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:17.891 07:43:02 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.891 07:43:02 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.273 07:43:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.273 07:43:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:19.273 07:43:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:19.273 07:43:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.273 07:43:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:19.844 07:43:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:19.844 07:43:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:19.844 07:43:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:19.844 07:43:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:20.785 07:43:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:20.785 07:43:05 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:20.785 07:43:05 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:20.785 07:43:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:20.785 07:43:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.356 07:43:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:21.356 00:07:21.356 real 0m4.226s 00:07:21.356 user 0m0.025s 00:07:21.356 sys 0m0.006s 00:07:21.356 07:43:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:21.356 07:43:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:21.356 ************************************ 00:07:21.357 END TEST scheduler_create_thread 00:07:21.357 ************************************ 00:07:21.357 07:43:06 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:07:21.357 07:43:06 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:21.357 07:43:06 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1550254 00:07:21.357 07:43:06 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 1550254 ']' 00:07:21.357 07:43:06 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 1550254 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1550254 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1550254' 00:07:21.617 killing process with pid 1550254 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 1550254 00:07:21.617 07:43:06 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 1550254 00:07:21.617 [2024-07-15 07:43:06.352367] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
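The teardown just above (killprocess 1550254) is the same pattern used after each test app in this log: confirm the pid is still alive, log what is being killed, then kill it and wait for it to exit. A rough sketch of that sequence, not the actual autotest_common.sh helper:

    pid=1550254                                   # scheduler app pid from the trace above
    [ -n "$pid" ] && kill -0 "$pid" || exit 0     # nothing to do if it is already gone
    name=$(ps --no-headers -o comm= "$pid")       # e.g. reactor_2 in the trace above
    [ "$name" = "sudo" ] || echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"                                   # reap it (only works if it is our child)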
00:07:22.251 00:07:22.251 real 0m5.979s 00:07:22.251 user 0m12.712s 00:07:22.251 sys 0m0.464s 00:07:22.251 07:43:06 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.251 07:43:06 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:22.251 ************************************ 00:07:22.251 END TEST event_scheduler 00:07:22.251 ************************************ 00:07:22.251 07:43:06 event -- common/autotest_common.sh@1142 -- # return 0 00:07:22.251 07:43:06 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:22.251 07:43:06 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:22.251 07:43:06 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:22.251 07:43:06 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.251 07:43:06 event -- common/autotest_common.sh@10 -- # set +x 00:07:22.251 ************************************ 00:07:22.251 START TEST app_repeat 00:07:22.251 ************************************ 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1551254 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1551254' 00:07:22.251 Process app_repeat pid: 1551254 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:22.251 spdk_app_start Round 0 00:07:22.251 07:43:06 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1551254 /var/tmp/spdk-nbd.sock 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1551254 ']' 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:22.251 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:22.251 07:43:06 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:22.251 [2024-07-15 07:43:06.799203] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:07:22.251 [2024-07-15 07:43:06.799266] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551254 ] 00:07:22.251 [2024-07-15 07:43:06.892288] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:22.251 [2024-07-15 07:43:06.969623] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.251 [2024-07-15 07:43:06.969628] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.189 07:43:07 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:23.189 07:43:07 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:23.189 07:43:07 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.190 Malloc0 00:07:23.190 07:43:07 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.449 Malloc1 00:07:23.449 07:43:08 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.449 07:43:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:23.709 /dev/nbd0 00:07:23.709 07:43:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:23.709 07:43:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.709 1+0 records in 00:07:23.709 1+0 records out 00:07:23.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218077 s, 18.8 MB/s 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:23.709 07:43:08 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:23.709 07:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.709 07:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.709 07:43:08 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:23.969 /dev/nbd1 00:07:23.969 07:43:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:23.969 07:43:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:23.969 1+0 records in 00:07:23.969 1+0 records out 00:07:23.969 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240978 s, 17.0 MB/s 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:23.969 07:43:08 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:23.969 07:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.969 07:43:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.969 07:43:08 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:23.969 07:43:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.969 07:43:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.230 { 00:07:24.230 "nbd_device": "/dev/nbd0", 00:07:24.230 "bdev_name": "Malloc0" 00:07:24.230 }, 00:07:24.230 { 00:07:24.230 "nbd_device": "/dev/nbd1", 00:07:24.230 "bdev_name": "Malloc1" 00:07:24.230 } 00:07:24.230 ]' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.230 { 00:07:24.230 "nbd_device": "/dev/nbd0", 00:07:24.230 "bdev_name": "Malloc0" 00:07:24.230 }, 00:07:24.230 { 00:07:24.230 "nbd_device": "/dev/nbd1", 00:07:24.230 "bdev_name": "Malloc1" 00:07:24.230 } 00:07:24.230 ]' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.230 /dev/nbd1' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.230 /dev/nbd1' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:24.230 256+0 records in 00:07:24.230 256+0 records out 00:07:24.230 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125198 s, 83.8 MB/s 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.230 256+0 records in 00:07:24.230 256+0 records out 00:07:24.230 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.014619 s, 71.7 MB/s 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.230 256+0 records in 00:07:24.230 256+0 records out 00:07:24.230 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0156212 s, 67.1 MB/s 00:07:24.230 07:43:08 event.app_repeat -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.230 07:43:08 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.490 07:43:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:24.750 07:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:24.750 07:43:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.751 07:43:09 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.751 07:43:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:25.010 07:43:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:25.011 07:43:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:25.011 07:43:09 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:25.011 07:43:09 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:25.011 07:43:09 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:25.270 07:43:09 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:25.270 [2024-07-15 07:43:09.929825] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.270 [2024-07-15 07:43:09.991525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.270 [2024-07-15 07:43:09.991530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.270 [2024-07-15 07:43:10.023063] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:25.270 [2024-07-15 07:43:10.023098] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:28.568 07:43:12 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:28.568 07:43:12 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:28.568 spdk_app_start Round 1 00:07:28.568 07:43:12 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1551254 /var/tmp/spdk-nbd.sock 00:07:28.568 07:43:12 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1551254 ']' 00:07:28.568 07:43:12 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:28.568 07:43:12 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:28.568 07:43:12 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:28.568 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
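Note on the trace above: app_repeat has just torn Round 0 down (NBD disks stopped, nbd_get_disks back to an empty list, spdk_kill_instance SIGTERM) and announced Round 1. The whole test is three such rounds driven by the `for i in {0..2}` loop in event/event.sh; each round waits for the app socket, creates two malloc bdevs, exports them as /dev/nbd0 and /dev/nbd1, runs the random-data write/verify pass, and kills the app instance before sleeping 3 seconds. A minimal sketch of that loop, reconstructed from the xtrace rather than copied from event.sh (rpc.py path shortened; $app_pid is an assumed variable standing for the 1551254 pid seen in the trace):

    rpc="rpc.py -s /var/tmp/spdk-nbd.sock"
    for i in {0..2}; do
        echo "spdk_app_start Round $i"
        waitforlisten "$app_pid" /var/tmp/spdk-nbd.sock     # 1551254 in this run; the app_repeat example restarts internally, so the same pid is awaited each round
        $rpc bdev_malloc_create 64 4096                     # -> Malloc0
        $rpc bdev_malloc_create 64 4096                     # -> Malloc1
        nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
        $rpc spdk_kill_instance SIGTERM                     # end this round
        sleep 3
    done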
00:07:28.568 07:43:12 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:28.568 07:43:12 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:28.568 07:43:13 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:28.568 07:43:13 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:28.568 07:43:13 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.568 Malloc0 00:07:28.568 07:43:13 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:28.828 Malloc1 00:07:28.828 07:43:13 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:28.828 07:43:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:29.087 /dev/nbd0 00:07:29.087 07:43:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.087 07:43:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:29.087 1+0 records in 00:07:29.087 1+0 records out 00:07:29.087 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302291 s, 13.5 MB/s 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.087 07:43:13 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:29.088 07:43:13 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.088 07:43:13 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:29.088 07:43:13 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:29.088 07:43:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.088 07:43:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.088 07:43:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:29.347 /dev/nbd1 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.347 1+0 records in 00:07:29.347 1+0 records out 00:07:29.347 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270114 s, 15.2 MB/s 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:29.347 07:43:13 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.347 07:43:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:07:29.607 { 00:07:29.607 "nbd_device": "/dev/nbd0", 00:07:29.607 "bdev_name": "Malloc0" 00:07:29.607 }, 00:07:29.607 { 00:07:29.607 "nbd_device": "/dev/nbd1", 00:07:29.607 "bdev_name": "Malloc1" 00:07:29.607 } 00:07:29.607 ]' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:29.607 { 00:07:29.607 "nbd_device": "/dev/nbd0", 00:07:29.607 "bdev_name": "Malloc0" 00:07:29.607 }, 00:07:29.607 { 00:07:29.607 "nbd_device": "/dev/nbd1", 00:07:29.607 "bdev_name": "Malloc1" 00:07:29.607 } 00:07:29.607 ]' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:29.607 /dev/nbd1' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:29.607 /dev/nbd1' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:29.607 256+0 records in 00:07:29.607 256+0 records out 00:07:29.607 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125059 s, 83.8 MB/s 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:29.607 256+0 records in 00:07:29.607 256+0 records out 00:07:29.607 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0151162 s, 69.4 MB/s 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:29.607 256+0 records in 00:07:29.607 256+0 records out 00:07:29.607 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0164356 s, 63.8 MB/s 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.607 07:43:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.867 07:43:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.127 07:43:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:30.387 07:43:14 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:30.387 07:43:14 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:30.647 07:43:15 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:30.647 [2024-07-15 07:43:15.292051] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:30.647 [2024-07-15 07:43:15.354161] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.647 [2024-07-15 07:43:15.354165] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.647 [2024-07-15 07:43:15.385324] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:30.647 [2024-07-15 07:43:15.385356] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:33.942 07:43:18 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:33.942 07:43:18 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:33.942 spdk_app_start Round 2 00:07:33.942 07:43:18 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1551254 /var/tmp/spdk-nbd.sock 00:07:33.942 07:43:18 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1551254 ']' 00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:33.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
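For reference, the dd/cmp sequence that repeats in every round above (nbd_dd_data_verify in bdev/nbd_common.sh) amounts to: fill a 1 MiB temp file from /dev/urandom, write it onto each exported NBD device with O_DIRECT, then byte-compare the first 1 MiB of each device against the temp file. A sketch reconstructed from the xtrace; $SPDK_DIR is an assumed shorthand for the workspace path, not a variable the script itself uses:

    tmp_file=$SPDK_DIR/test/event/nbdrandtest
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256               # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp_file" of="$nbd" bs=4096 count=256 oflag=direct    # write pass
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$tmp_file" "$nbd"                               # verify pass: must match byte for byte
    done
    rm "$tmp_file"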
00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:33.943 07:43:18 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:33.943 07:43:18 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:33.943 Malloc0 00:07:33.943 07:43:18 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:34.203 Malloc1 00:07:34.203 07:43:18 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.203 07:43:18 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:34.463 /dev/nbd0 00:07:34.463 07:43:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:34.463 07:43:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:07:34.463 1+0 records in 00:07:34.463 1+0 records out 00:07:34.463 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263934 s, 15.5 MB/s 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.463 07:43:19 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:34.463 07:43:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.463 07:43:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.463 07:43:19 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:34.722 /dev/nbd1 00:07:34.722 07:43:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:34.722 07:43:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:34.722 07:43:19 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:34.722 07:43:19 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:34.722 07:43:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:34.722 07:43:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:34.722 07:43:19 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:34.722 07:43:19 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:34.723 1+0 records in 00:07:34.723 1+0 records out 00:07:34.723 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027799 s, 14.7 MB/s 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:34.723 07:43:19 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:34.723 07:43:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.723 07:43:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:34.723 07:43:19 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:34.723 07:43:19 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.723 07:43:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:34.723 07:43:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:34.723 
{ 00:07:34.723 "nbd_device": "/dev/nbd0", 00:07:34.723 "bdev_name": "Malloc0" 00:07:34.723 }, 00:07:34.723 { 00:07:34.723 "nbd_device": "/dev/nbd1", 00:07:34.723 "bdev_name": "Malloc1" 00:07:34.723 } 00:07:34.723 ]' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:34.983 { 00:07:34.983 "nbd_device": "/dev/nbd0", 00:07:34.983 "bdev_name": "Malloc0" 00:07:34.983 }, 00:07:34.983 { 00:07:34.983 "nbd_device": "/dev/nbd1", 00:07:34.983 "bdev_name": "Malloc1" 00:07:34.983 } 00:07:34.983 ]' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:34.983 /dev/nbd1' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:34.983 /dev/nbd1' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:34.983 256+0 records in 00:07:34.983 256+0 records out 00:07:34.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125332 s, 83.7 MB/s 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:34.983 256+0 records in 00:07:34.983 256+0 records out 00:07:34.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0145275 s, 72.2 MB/s 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:34.983 256+0 records in 00:07:34.983 256+0 records out 00:07:34.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0155867 s, 67.3 MB/s 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:34.983 07:43:19 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:35.244 07:43:19 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:35.505 07:43:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:35.766 07:43:20 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:35.766 07:43:20 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:36.025 07:43:20 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:36.025 [2024-07-15 07:43:20.648644] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:36.025 [2024-07-15 07:43:20.709955] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:36.025 [2024-07-15 07:43:20.709960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.025 [2024-07-15 07:43:20.740406] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:36.025 [2024-07-15 07:43:20.740438] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:39.323 07:43:23 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1551254 /var/tmp/spdk-nbd.sock 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1551254 ']' 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:39.323 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
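The waitfornbd / waitfornbd_exit helpers traced throughout the rounds above both poll /proc/partitions for the device name, capped at 20 attempts: one returns once the device appears (and then reads a single 4096-byte block back to confirm it is usable), the other once it disappears after nbd_stop_disk. Roughly as below; the per-attempt delay and the scratch-file path are assumptions, since the trace only shows checks that succeed on the first try:

    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break      # device is listed
            sleep 0.1                                             # assumed retry delay
        done
        dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]                                          # non-empty read == device usable
    }

    waitfornbd_exit() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions || break      # device gone
            sleep 0.1                                             # assumed retry delay
        done
        return 0
    }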
00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:39.323 07:43:23 event.app_repeat -- event/event.sh@39 -- # killprocess 1551254 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 1551254 ']' 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 1551254 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1551254 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1551254' 00:07:39.323 killing process with pid 1551254 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@967 -- # kill 1551254 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@972 -- # wait 1551254 00:07:39.323 spdk_app_start is called in Round 0. 00:07:39.323 Shutdown signal received, stop current app iteration 00:07:39.323 Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 reinitialization... 00:07:39.323 spdk_app_start is called in Round 1. 00:07:39.323 Shutdown signal received, stop current app iteration 00:07:39.323 Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 reinitialization... 00:07:39.323 spdk_app_start is called in Round 2. 00:07:39.323 Shutdown signal received, stop current app iteration 00:07:39.323 Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 reinitialization... 00:07:39.323 spdk_app_start is called in Round 3. 
00:07:39.323 Shutdown signal received, stop current app iteration 00:07:39.323 07:43:23 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:39.323 07:43:23 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:39.323 00:07:39.323 real 0m17.156s 00:07:39.323 user 0m37.998s 00:07:39.323 sys 0m2.533s 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:39.323 07:43:23 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:39.323 ************************************ 00:07:39.323 END TEST app_repeat 00:07:39.323 ************************************ 00:07:39.323 07:43:23 event -- common/autotest_common.sh@1142 -- # return 0 00:07:39.323 07:43:23 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:39.323 00:07:39.323 real 0m27.393s 00:07:39.323 user 0m57.338s 00:07:39.323 sys 0m3.636s 00:07:39.323 07:43:23 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:39.323 07:43:23 event -- common/autotest_common.sh@10 -- # set +x 00:07:39.323 ************************************ 00:07:39.323 END TEST event 00:07:39.323 ************************************ 00:07:39.323 07:43:23 -- common/autotest_common.sh@1142 -- # return 0 00:07:39.323 07:43:23 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:39.323 07:43:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:39.323 07:43:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.323 07:43:23 -- common/autotest_common.sh@10 -- # set +x 00:07:39.323 ************************************ 00:07:39.323 START TEST thread 00:07:39.323 ************************************ 00:07:39.323 07:43:24 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:39.584 * Looking for test storage... 00:07:39.584 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:39.584 07:43:24 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:39.584 07:43:24 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:39.584 07:43:24 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.584 07:43:24 thread -- common/autotest_common.sh@10 -- # set +x 00:07:39.584 ************************************ 00:07:39.584 START TEST thread_poller_perf 00:07:39.584 ************************************ 00:07:39.584 07:43:24 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:39.584 [2024-07-15 07:43:24.178996] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:39.584 [2024-07-15 07:43:24.179056] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554516 ] 00:07:39.584 [2024-07-15 07:43:24.253355] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.584 [2024-07-15 07:43:24.315262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.584 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:40.967 ====================================== 00:07:40.967 busy:2607286630 (cyc) 00:07:40.967 total_run_count: 311000 00:07:40.967 tsc_hz: 2600000000 (cyc) 00:07:40.967 ====================================== 00:07:40.967 poller_cost: 8383 (cyc), 3224 (nsec) 00:07:40.967 00:07:40.967 real 0m1.221s 00:07:40.967 user 0m1.138s 00:07:40.967 sys 0m0.075s 00:07:40.967 07:43:25 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.967 07:43:25 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:40.967 ************************************ 00:07:40.967 END TEST thread_poller_perf 00:07:40.967 ************************************ 00:07:40.967 07:43:25 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:40.967 07:43:25 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:40.967 07:43:25 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:40.967 07:43:25 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.967 07:43:25 thread -- common/autotest_common.sh@10 -- # set +x 00:07:40.967 ************************************ 00:07:40.967 START TEST thread_poller_perf 00:07:40.967 ************************************ 00:07:40.967 07:43:25 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:40.967 [2024-07-15 07:43:25.473692] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:40.967 [2024-07-15 07:43:25.473788] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554832 ] 00:07:40.967 [2024-07-15 07:43:25.564150] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.967 [2024-07-15 07:43:25.632135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.967 Running 1000 pollers for 1 seconds with 0 microseconds period. 
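On the poller_perf reports: poller_cost is busy TSC cycles divided by total_run_count, converted to nanoseconds via tsc_hz. For the 1-microsecond timed-poller run above that gives 2607286630 / 311000 ≈ 8383 cycles ≈ 3224 ns at 2.6 GHz; the 0-microsecond (busy poller) run whose report follows comes out at 637 cycles / 245 ns per poller invocation. The same arithmetic in shell, using the first report's figures (illustrative only, not part of the test):

    busy_cyc=2607286630; total_run_count=311000; tsc_hz=2600000000
    echo "poller_cost_cyc=$((busy_cyc / total_run_count))"                           # ~8383
    echo "poller_cost_nsec=$((busy_cyc * 1000000000 / tsc_hz / total_run_count))"    # ~3224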
00:07:42.352 ====================================== 00:07:42.352 busy:2602080660 (cyc) 00:07:42.352 total_run_count: 4084000 00:07:42.352 tsc_hz: 2600000000 (cyc) 00:07:42.352 ====================================== 00:07:42.352 poller_cost: 637 (cyc), 245 (nsec) 00:07:42.352 00:07:42.352 real 0m1.236s 00:07:42.352 user 0m1.139s 00:07:42.352 sys 0m0.093s 00:07:42.352 07:43:26 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.352 07:43:26 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:42.352 ************************************ 00:07:42.352 END TEST thread_poller_perf 00:07:42.352 ************************************ 00:07:42.352 07:43:26 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:42.353 07:43:26 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:42.353 00:07:42.353 real 0m2.699s 00:07:42.353 user 0m2.370s 00:07:42.353 sys 0m0.335s 00:07:42.353 07:43:26 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.353 07:43:26 thread -- common/autotest_common.sh@10 -- # set +x 00:07:42.353 ************************************ 00:07:42.353 END TEST thread 00:07:42.353 ************************************ 00:07:42.353 07:43:26 -- common/autotest_common.sh@1142 -- # return 0 00:07:42.353 07:43:26 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:42.353 07:43:26 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:42.353 07:43:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.353 07:43:26 -- common/autotest_common.sh@10 -- # set +x 00:07:42.353 ************************************ 00:07:42.353 START TEST accel 00:07:42.353 ************************************ 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:42.353 * Looking for test storage... 00:07:42.353 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:42.353 07:43:26 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:42.353 07:43:26 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:42.353 07:43:26 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:42.353 07:43:26 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1555015 00:07:42.353 07:43:26 accel -- accel/accel.sh@63 -- # waitforlisten 1555015 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@829 -- # '[' -z 1555015 ']' 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:42.353 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
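Both the event app earlier and the spdk_tgt being started here are eventually shut down through the killprocess helper; its xtrace appears at the end of app_repeat above and again once the accel opcode checks below finish. Reconstructed roughly from those traces — only the happy path is visible, so the empty-pid guard and the sudo branch are assumptions:

    killprocess() {
        local pid=$1
        [ -n "$pid" ] || return 1                              # the trace shows an explicit '-z' guard
        kill -0 "$pid"                                         # process must still be alive
        if [ "$(uname)" = Linux ]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")    # reactor_0 in this run
            [ "$process_name" = sudo ] && :                    # a sudo-wrapped target is handled specially; branch not taken here
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }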
00:07:42.353 07:43:26 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:42.353 07:43:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.353 07:43:26 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:42.353 07:43:26 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.353 07:43:26 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.353 07:43:26 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.353 07:43:26 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.353 07:43:26 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.353 07:43:26 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:42.353 07:43:26 accel -- accel/accel.sh@41 -- # jq -r . 00:07:42.353 [2024-07-15 07:43:26.977307] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:42.353 [2024-07-15 07:43:26.977375] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555015 ] 00:07:42.353 [2024-07-15 07:43:27.069141] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.614 [2024-07-15 07:43:27.137020] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.184 07:43:27 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:43.184 07:43:27 accel -- common/autotest_common.sh@862 -- # return 0 00:07:43.184 07:43:27 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:43.184 07:43:27 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:43.184 07:43:27 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:43.184 07:43:27 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:43.184 07:43:27 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:43.184 07:43:27 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:43.184 07:43:27 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:43.184 07:43:27 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:43.184 07:43:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.184 07:43:27 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 
07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.184 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.184 07:43:27 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # IFS== 00:07:43.184 07:43:27 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:43.185 07:43:27 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:43.185 07:43:27 accel -- accel/accel.sh@75 -- # killprocess 1555015 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@948 -- # '[' -z 1555015 ']' 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@952 -- # kill -0 1555015 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@953 -- # uname 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1555015 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1555015' 00:07:43.185 killing process with pid 1555015 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@967 -- # kill 1555015 00:07:43.185 07:43:27 accel -- common/autotest_common.sh@972 -- # wait 1555015 00:07:43.445 07:43:28 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:43.445 07:43:28 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:43.445 07:43:28 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:43.445 07:43:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:43.445 07:43:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.445 07:43:28 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:43.445 07:43:28 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
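The long run of IFS== / read -r opc module / expected_opcs assignments a little earlier (accel.sh@70-@73) is the RPC output being folded into an associative map: accel_get_opc_assignments returns opcode-to-module pairs, jq flattens them into key=value strings, and each string is split on '='. In this run every opcode resolves to the software module, which is why the xtrace shows the literal value software on each assignment; the script itself presumably assigns $module. A sketch with the rpc.py path shortened:

    declare -A expected_opcs
    exp_opcs=($(rpc.py accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))   # e.g. "copy=software" (opcode names assumed; only the =software half is visible here)
    for opc_opt in "${exp_opcs[@]}"; do
        IFS== read -r opc module <<< "$opc_opt"                       # split key=value on '='
        expected_opcs["$opc"]=$module
    done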
00:07:43.445 07:43:28 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:43.445 07:43:28 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:43.705 07:43:28 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:43.705 07:43:28 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:43.705 07:43:28 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:43.705 07:43:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:43.705 07:43:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:43.705 ************************************ 00:07:43.705 START TEST accel_missing_filename 00:07:43.705 ************************************ 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:43.705 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:43.705 07:43:28 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:43.705 07:43:28 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:43.705 07:43:28 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:43.705 07:43:28 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:43.705 07:43:28 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:43.705 07:43:28 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:43.706 07:43:28 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:43.706 07:43:28 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:43.706 07:43:28 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:43.706 [2024-07-15 07:43:28.286636] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:43.706 [2024-07-15 07:43:28.286695] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555252 ] 00:07:43.706 [2024-07-15 07:43:28.375972] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.706 [2024-07-15 07:43:28.445624] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.005 [2024-07-15 07:43:28.498531] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:44.005 [2024-07-15 07:43:28.536166] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:44.005 A filename is required. 
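That "A filename is required." line is the whole point of the accel_missing_filename case: a compress workload is launched without the -l input file, accel_perf refuses to start, and the NOT wrapper counts the non-zero exit as a pass. Reduced to a by-hand check (illustrative only; the -c /dev/fd/62 argument in the traced command appears to be the JSON accel config the harness pipes in over a file descriptor, so it is dropped here):

    # compress needs an uncompressed input file via -l, so this is expected to abort
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w compress
    # expected output: "A filename is required." followed by a non-zero exit status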
00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:44.005 00:07:44.005 real 0m0.337s 00:07:44.005 user 0m0.221s 00:07:44.005 sys 0m0.139s 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.005 07:43:28 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:44.005 ************************************ 00:07:44.005 END TEST accel_missing_filename 00:07:44.005 ************************************ 00:07:44.005 07:43:28 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.005 07:43:28 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:44.005 07:43:28 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:44.005 07:43:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.005 07:43:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.005 ************************************ 00:07:44.005 START TEST accel_compress_verify 00:07:44.005 ************************************ 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:44.005 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.005 07:43:28 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:44.005 07:43:28 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:44.005 [2024-07-15 07:43:28.700364] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:44.005 [2024-07-15 07:43:28.700429] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555342 ] 00:07:44.278 [2024-07-15 07:43:28.789457] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.278 [2024-07-15 07:43:28.865147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.278 [2024-07-15 07:43:28.911451] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:44.278 [2024-07-15 07:43:28.949569] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:44.278 00:07:44.278 Compression does not support the verify option, aborting. 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:44.278 00:07:44.278 real 0m0.337s 00:07:44.278 user 0m0.227s 00:07:44.278 sys 0m0.135s 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.278 07:43:28 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:44.278 ************************************ 00:07:44.278 END TEST accel_compress_verify 00:07:44.278 ************************************ 00:07:44.539 07:43:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.539 07:43:29 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:44.539 07:43:29 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:44.539 07:43:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.539 07:43:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.539 ************************************ 00:07:44.539 START TEST accel_wrong_workload 00:07:44.539 ************************************ 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:44.539 07:43:29 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:44.539 Unsupported workload type: foobar 00:07:44.539 [2024-07-15 07:43:29.112750] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:44.539 accel_perf options: 00:07:44.539 [-h help message] 00:07:44.539 [-q queue depth per core] 00:07:44.539 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:44.539 [-T number of threads per core 00:07:44.539 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:44.539 [-t time in seconds] 00:07:44.539 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:44.539 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:44.539 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:44.539 [-l for compress/decompress workloads, name of uncompressed input file 00:07:44.539 [-S for crc32c workload, use this seed value (default 0) 00:07:44.539 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:44.539 [-f for fill workload, use this BYTE value (default 255) 00:07:44.539 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:44.539 [-y verify result if this switch is on] 00:07:44.539 [-a tasks to allocate per core (default: same value as -q)] 00:07:44.539 Can be used to spread operations across a wider range of memory. 
00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:44.539 00:07:44.539 real 0m0.042s 00:07:44.539 user 0m0.024s 00:07:44.539 sys 0m0.017s 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.539 07:43:29 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:44.539 ************************************ 00:07:44.539 END TEST accel_wrong_workload 00:07:44.539 ************************************ 00:07:44.539 Error: writing output failed: Broken pipe 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.540 07:43:29 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.540 ************************************ 00:07:44.540 START TEST accel_negative_buffers 00:07:44.540 ************************************ 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:44.540 07:43:29 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:44.540 -x option must be non-negative. 
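The xor case above exercises the same NOT pattern with an invalid buffer count: -x -1 fails option validation immediately ("-x option must be non-negative.", with the parse error and a second options dump following below), and the help text itself states that xor needs at least two source buffers. Stripped of the harness's piped-in JSON config, the check reduces to (illustrative, not part of the recorded run):

    # the documented minimum for -x is 2, so -1 must be rejected before any work is submitted
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -t 1 -w xor -y -x -1
    # expected: a non-zero exit status, which the NOT wrapper converts into a passing test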
00:07:44.540 [2024-07-15 07:43:29.227049] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:44.540 accel_perf options: 00:07:44.540 [-h help message] 00:07:44.540 [-q queue depth per core] 00:07:44.540 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:44.540 [-T number of threads per core 00:07:44.540 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:44.540 [-t time in seconds] 00:07:44.540 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:44.540 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:44.540 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:44.540 [-l for compress/decompress workloads, name of uncompressed input file 00:07:44.540 [-S for crc32c workload, use this seed value (default 0) 00:07:44.540 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:44.540 [-f for fill workload, use this BYTE value (default 255) 00:07:44.540 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:44.540 [-y verify result if this switch is on] 00:07:44.540 [-a tasks to allocate per core (default: same value as -q)] 00:07:44.540 Can be used to spread operations across a wider range of memory. 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:44.540 00:07:44.540 real 0m0.040s 00:07:44.540 user 0m0.028s 00:07:44.540 sys 0m0.012s 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.540 07:43:29 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:44.540 ************************************ 00:07:44.540 END TEST accel_negative_buffers 00:07:44.540 ************************************ 00:07:44.540 Error: writing output failed: Broken pipe 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.540 07:43:29 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.540 07:43:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.801 ************************************ 00:07:44.801 START TEST accel_crc32c 00:07:44.801 ************************************ 00:07:44.801 07:43:29 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:44.801 [2024-07-15 07:43:29.340818] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:44.801 [2024-07-15 07:43:29.340876] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555625 ] 00:07:44.801 [2024-07-15 07:43:29.428415] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.801 [2024-07-15 07:43:29.498739] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.801 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:44.802 07:43:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.185 07:43:30 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:46.185 07:43:30 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.185 00:07:46.185 real 0m1.329s 00:07:46.185 user 0m1.205s 00:07:46.185 sys 0m0.125s 00:07:46.185 07:43:30 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.185 07:43:30 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:46.185 ************************************ 00:07:46.185 END TEST accel_crc32c 00:07:46.185 ************************************ 00:07:46.185 07:43:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.185 07:43:30 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:46.185 07:43:30 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:46.185 07:43:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.185 07:43:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.185 ************************************ 00:07:46.185 START TEST accel_crc32c_C2 00:07:46.185 ************************************ 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.185 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.186 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.186 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.186 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.186 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:46.186 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:46.186 [2024-07-15 07:43:30.745619] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:46.186 [2024-07-15 07:43:30.745686] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1555794 ] 00:07:46.186 [2024-07-15 07:43:30.836366] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.186 [2024-07-15 07:43:30.911744] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:46.446 07:43:30 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.446 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.447 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:46.447 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:46.447 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.447 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:46.447 07:43:30 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.388 00:07:47.388 real 0m1.343s 00:07:47.388 user 0m1.206s 00:07:47.388 sys 0m0.132s 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.388 07:43:32 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:47.388 ************************************ 00:07:47.388 END TEST accel_crc32c_C2 00:07:47.388 ************************************ 00:07:47.388 07:43:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.388 07:43:32 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:47.388 07:43:32 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:47.388 07:43:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.388 07:43:32 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.388 ************************************ 00:07:47.388 START TEST accel_copy 00:07:47.388 ************************************ 00:07:47.388 07:43:32 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.388 
07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:47.388 07:43:32 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:47.649 [2024-07-15 07:43:32.164233] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:47.649 [2024-07-15 07:43:32.164297] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556000 ] 00:07:47.649 [2024-07-15 07:43:32.256413] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.649 [2024-07-15 07:43:32.332519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.649 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:47.650 07:43:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.032 07:43:33 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:49.032 07:43:33 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.032 00:07:49.032 real 0m1.346s 00:07:49.032 user 0m1.215s 00:07:49.032 sys 0m0.127s 00:07:49.032 07:43:33 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:49.032 07:43:33 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:49.032 ************************************ 00:07:49.032 END TEST accel_copy 00:07:49.032 ************************************ 00:07:49.032 07:43:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:49.032 07:43:33 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.032 07:43:33 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:49.032 07:43:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.032 07:43:33 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.032 ************************************ 00:07:49.032 START TEST accel_fill 00:07:49.032 ************************************ 00:07:49.033 07:43:33 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 
]] 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:49.033 07:43:33 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:49.033 [2024-07-15 07:43:33.584722] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:49.033 [2024-07-15 07:43:33.584784] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556302 ] 00:07:49.033 [2024-07-15 07:43:33.674480] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.033 [2024-07-15 07:43:33.748319] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 
07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.292 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:49.293 07:43:33 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@21 
-- # case "$var" in 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:50.231 07:43:34 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.231 00:07:50.231 real 0m1.332s 00:07:50.231 user 0m1.203s 00:07:50.231 sys 0m0.131s 00:07:50.231 07:43:34 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:50.231 07:43:34 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:50.231 ************************************ 00:07:50.231 END TEST accel_fill 00:07:50.231 ************************************ 00:07:50.231 07:43:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:50.231 07:43:34 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:50.231 07:43:34 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:50.231 07:43:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:50.231 07:43:34 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.231 ************************************ 00:07:50.231 START TEST accel_copy_crc32c 00:07:50.231 ************************************ 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:50.231 07:43:34 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:50.491 [2024-07-15 07:43:34.992979] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:50.491 [2024-07-15 07:43:34.993042] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556623 ] 00:07:50.491 [2024-07-15 07:43:35.080592] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.491 [2024-07-15 07:43:35.151975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:50.491 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:50.492 07:43:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.872 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 
00:07:51.872 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.872 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.872 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.872 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.872 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.873 00:07:51.873 real 0m1.324s 00:07:51.873 user 0m1.205s 00:07:51.873 sys 0m0.123s 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:51.873 07:43:36 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:51.873 ************************************ 00:07:51.873 END TEST accel_copy_crc32c 00:07:51.873 ************************************ 00:07:51.873 07:43:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:51.873 07:43:36 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:51.873 07:43:36 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:51.873 07:43:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:51.873 07:43:36 accel -- common/autotest_common.sh@10 -- # set +x 00:07:51.873 ************************************ 00:07:51.873 START TEST accel_copy_crc32c_C2 00:07:51.873 ************************************ 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:51.873 07:43:36 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:51.873 [2024-07-15 07:43:36.396148] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:07:51.873 [2024-07-15 07:43:36.396213] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556892 ] 00:07:51.873 [2024-07-15 07:43:36.487640] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.873 [2024-07-15 07:43:36.562422] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:51.873 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # 
IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:51.874 07:43:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:53.254 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:53.255 00:07:53.255 real 0m1.338s 00:07:53.255 user 0m1.216s 
00:07:53.255 sys 0m0.124s 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:53.255 07:43:37 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:53.255 ************************************ 00:07:53.255 END TEST accel_copy_crc32c_C2 00:07:53.255 ************************************ 00:07:53.255 07:43:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:53.255 07:43:37 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:53.255 07:43:37 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:53.255 07:43:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:53.255 07:43:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:53.255 ************************************ 00:07:53.255 START TEST accel_dualcast 00:07:53.255 ************************************ 00:07:53.255 07:43:37 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:53.255 07:43:37 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:53.255 [2024-07-15 07:43:37.806519] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:07:53.255 [2024-07-15 07:43:37.806588] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557010 ] 00:07:53.255 [2024-07-15 07:43:37.897065] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.255 [2024-07-15 07:43:37.969768] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:53.515 07:43:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:54.454 07:43:39 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:54.454 00:07:54.454 real 0m1.345s 00:07:54.454 user 0m1.203s 00:07:54.454 sys 0m0.135s 00:07:54.454 07:43:39 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.454 07:43:39 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:54.454 ************************************ 00:07:54.454 END TEST accel_dualcast 00:07:54.454 ************************************ 00:07:54.454 07:43:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:54.454 07:43:39 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:54.454 07:43:39 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:54.454 07:43:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.454 07:43:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:54.454 ************************************ 00:07:54.454 START TEST accel_compare 00:07:54.454 ************************************ 00:07:54.454 07:43:39 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:54.454 07:43:39 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:54.714 [2024-07-15 07:43:39.227554] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:07:54.714 [2024-07-15 07:43:39.227618] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557304 ] 00:07:54.714 [2024-07-15 07:43:39.317561] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.714 [2024-07-15 07:43:39.382759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.714 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.715 07:43:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 
00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:56.099 07:43:40 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:56.099 00:07:56.099 real 0m1.326s 00:07:56.099 user 0m1.201s 00:07:56.099 sys 0m0.126s 00:07:56.099 07:43:40 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.099 07:43:40 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:56.099 ************************************ 00:07:56.099 END TEST accel_compare 00:07:56.099 ************************************ 00:07:56.099 07:43:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:56.099 07:43:40 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:56.099 07:43:40 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:56.099 07:43:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:56.100 07:43:40 accel -- common/autotest_common.sh@10 -- # set +x 00:07:56.100 ************************************ 00:07:56.100 START TEST accel_xor 00:07:56.100 ************************************ 00:07:56.100 07:43:40 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:56.100 [2024-07-15 07:43:40.628732] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:07:56.100 [2024-07-15 07:43:40.628827] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557622 ] 00:07:56.100 [2024-07-15 07:43:40.725134] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.100 [2024-07-15 07:43:40.791507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:56.100 07:43:40 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:56.100 07:43:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:57.483 07:43:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.483 00:07:57.483 real 0m1.338s 00:07:57.483 user 0m1.201s 00:07:57.483 sys 0m0.133s 00:07:57.483 07:43:41 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.483 07:43:41 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:57.483 ************************************ 00:07:57.483 END TEST accel_xor 00:07:57.483 ************************************ 00:07:57.483 07:43:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:57.483 07:43:41 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:57.483 07:43:41 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:57.483 07:43:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.483 07:43:41 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.483 ************************************ 00:07:57.483 START TEST accel_xor 00:07:57.483 ************************************ 00:07:57.483 07:43:42 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:57.483 [2024-07-15 07:43:42.038451] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:07:57.483 [2024-07-15 07:43:42.038512] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557945 ]
00:07:57.483 [2024-07-15 07:43:42.127252] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:57.483 [2024-07-15 07:43:42.191635] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1
00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=xor
00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=3
00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes'
00:07:57.483 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=software
00:07:57.743 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=32
00:07:57.743 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=32
00:07:57.743 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=1
00:07:57.743 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds'
00:07:57.743 07:43:42 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes
00:07:58.682 07:43:43 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:58.682 07:43:43 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]]
00:07:58.682 07:43:43 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:58.682 
00:07:58.682 real 0m1.319s
00:07:58.682 user 0m1.192s
00:07:58.682 sys 0m0.132s
00:07:58.682 ************************************
00:07:58.682 END TEST accel_xor
00:07:58.682 ************************************
00:07:58.682 07:43:43 accel -- common/autotest_common.sh@1142 -- # return 0
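The two accel_xor cases above both drive SPDK's accel_perf example binary through the run_test wrapper; the second adds -x 3 to the recorded command line. Below is a minimal sketch of repeating that invocation by hand. It assumes the already-built tree at the path recorded in the log, drops the -c /dev/fd/62 JSON config that the harness generates (so default module selection applies, which on this host resolves to the software module), and whether sudo is actually required depends on how hugepages are set up.

# hedged sketch, not part of the job: re-run the traced xor case directly
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
# flags copied from the accel_perf command line recorded above:
#   -t 1  run for 1 second    -w xor  xor workload
#   -y    verify the results  -x 3    xor source count, as the accel_xor -x 3 case name suggests
sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w xor -y -x 3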
00:07:58.682 07:43:43 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify
00:07:58.682 ************************************
00:07:58.682 START TEST accel_dif_verify
00:07:58.682 ************************************
00:07:58.682 07:43:43 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
00:07:58.682 [2024-07-15 07:43:43.433450] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:07:58.682 [2024-07-15 07:43:43.433514] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558109 ]
00:07:58.941 [2024-07-15 07:43:43.524943] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:58.941 [2024-07-15 07:43:43.590487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes'
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes'
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes'
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes'
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds'
00:07:58.941 07:43:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No
00:08:00.324 07:43:44 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:00.324 07:43:44 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]]
00:08:00.324 07:43:44 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:00.324 
00:08:00.324 real 0m1.331s
00:08:00.324 user 0m1.207s
00:08:00.324 sys 0m0.124s
00:08:00.324 ************************************
00:08:00.324 END TEST accel_dif_verify
00:08:00.324 ************************************
00:08:00.324 07:43:44 accel -- common/autotest_common.sh@1142 -- # return 0
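The dif_verify config traced above carries '4096 bytes', '512 bytes' and '8 bytes' values alongside the 4096-byte transfer size. Reading those, as an assumption rather than anything accel.sh labels here, as a 4 KiB payload split into 512-byte blocks that each carry an 8-byte protection-information tuple gives a quick sanity check on the metadata volume:

# back-of-the-envelope check of the traced sizes (interpretation assumed, see note above)
payload=4096   # bytes per transfer, from val='4096 bytes'
block=512      # bytes per protected block, from val='512 bytes'
pi=8           # bytes of protection information per block, from val='8 bytes'
echo "$(( payload / block )) blocks per payload"             # prints 8
echo "$(( payload / block * pi )) bytes of PI per payload"   # prints 64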
00:08:00.324 07:43:44 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate
00:08:00.324 ************************************
00:08:00.324 START TEST accel_dif_generate
00:08:00.324 ************************************
00:08:00.324 07:43:44 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:08:00.324 [2024-07-15 07:43:44.839383] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:08:00.324 [2024-07-15 07:43:44.839451] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558311 ]
00:08:00.324 [2024-07-15 07:43:44.926481] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:00.324 [2024-07-15 07:43:44.993474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes'
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes'
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds'
00:08:00.325 07:43:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No
00:08:01.710 07:43:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:01.710 07:43:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
00:08:01.710 07:43:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:01.710 
00:08:01.710 real 0m1.328s
00:08:01.710 user 0m1.204s
00:08:01.710 sys 0m0.126s
00:08:01.710 ************************************
00:08:01.710 END TEST accel_dif_generate
00:08:01.710 ************************************
00:08:01.710 07:43:46 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:01.710 07:43:46 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
00:08:01.710 ************************************
00:08:01.710 START TEST accel_dif_generate_copy
00:08:01.710 ************************************
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:08:01.710 [2024-07-15 07:43:46.242788] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:08:01.710 [2024-07-15 07:43:46.242846] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558620 ]
00:08:01.710 [2024-07-15 07:43:46.334296] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:01.710 [2024-07-15 07:43:46.399094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds'
00:08:01.710 07:43:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No
00:08:03.096 07:43:47 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:03.096 07:43:47 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]]
00:08:03.096 07:43:47 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:03.096 
00:08:03.096 real 0m1.331s
00:08:03.096 user 0m1.205s
00:08:03.096 sys 0m0.125s
00:08:03.096 ************************************
00:08:03.096 END TEST accel_dif_generate_copy
00:08:03.096 ************************************
00:08:03.096 07:43:47 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:03.096 07:43:47 accel -- accel/accel.sh@115 -- # [[ y == y ]]
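The three DIF cases above (accel_dif_verify, accel_dif_generate, accel_dif_generate_copy) only change the -w opcode handed to accel_perf; the rest of the traced config (the 4096-byte transfers, the paired 32 values, the 1-second run time and the software module) is identical. A hedged sketch of replaying just that trio outside the run_test wrapper, under the same assumptions as the xor sketch earlier:

# assumes the built tree recorded in the log; sudo/hugepage needs depend on the host
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
for wl in dif_verify dif_generate dif_generate_copy; do
    echo "== accel_perf -w $wl =="
    sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w "$wl"
done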
00:08:03.096 07:43:47 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:03.096 ************************************
00:08:03.096 START TEST accel_comp
00:08:03.096 ************************************
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:03.097 [2024-07-15 07:43:47.643635] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:08:03.097 [2024-07-15 07:43:47.643682] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558935 ]
00:08:03.097 [2024-07-15 07:43:47.732618] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:03.097 [2024-07-15 07:43:47.796517] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=compress
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=software
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=32
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=32
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=1
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds'
00:08:03.097 07:43:47 accel.accel_comp -- accel/accel.sh@20 -- # val=No
00:08:04.369 07:43:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:04.369 07:43:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]]
00:08:04.369 07:43:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:04.369 
00:08:04.369 real 0m1.320s
00:08:04.369 user 0m1.211s
00:08:04.369 sys 0m0.116s
00:08:04.369 ************************************
00:08:04.369 END TEST accel_comp
00:08:04.369 ************************************
00:08:04.369 07:43:48 accel -- common/autotest_common.sh@1142 -- # return 0
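The accel_comp case above and the accel_decomp case that follows point accel_perf at the same corpus file, test/accel/bib, through -l; the decompress run additionally passes -y. A sketch of running the pair back to back with the recorded flags, again assuming the tree location from the log and leaving module selection to the defaults:

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
BIB=$SPDK_DIR/test/accel/bib   # input file used by both traced cases
# compress the corpus for one second, then the decompress path with -y as in the traced run
sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress -l "$BIB"
sudo "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress -l "$BIB" -y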
00:08:04.369 ************************************ 00:08:04.369 07:43:49 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:04.369 07:43:49 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:04.369 [2024-07-15 07:43:49.042038] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:04.369 [2024-07-15 07:43:49.042095] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559191 ] 00:08:04.629 [2024-07-15 07:43:49.134085] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.629 [2024-07-15 07:43:49.209092] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 
07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:04.629 07:43:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:04.630 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:04.630 07:43:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:06.011 07:43:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.011 00:08:06.011 real 0m1.336s 00:08:06.011 user 0m1.203s 00:08:06.011 sys 0m0.139s 00:08:06.011 07:43:50 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.011 07:43:50 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:06.011 ************************************ 00:08:06.011 END TEST accel_decomp 00:08:06.011 ************************************ 
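The block above is the single-core software decompress case: accel.sh launches the accel_perf example with the flags that appear verbatim in the trace, then replays the expected settings through its case loop (val=0x1, val=decompress, val='4096 bytes', val=software, val='1 seconds', ...) so it can assert afterwards that the software module really handled the decompress opcode. A minimal standalone sketch of the same run is below; $SPDK is only shorthand for the workspace path shown in the log, the -c /dev/fd/62 JSON accel config that accel.sh injects is left out on the assumption that a plain software run does not need it, and the flag readings (-t 1 as the 1-second run time, -w as the workload, -l as the compressed 'bib' input, -y as result verification) are inferred from the traced values rather than quoted from accel_perf's help text.

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # 1-second software decompress of the bundled 'bib' test file with result
  # verification; accel.sh additionally feeds a generated JSON accel config
  # via -c /dev/fd/62, assumed unnecessary for the default software module.
  "$SPDK"/build/examples/accel_perf -t 1 -w decompress -l "$SPDK"/test/accel/bib -y

The 'real 0m1.336s' that closes the test is consistent with that 1-second -t budget plus SPDK application start-up and teardown.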
00:08:06.011 07:43:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.011 07:43:50 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:06.011 07:43:50 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:06.011 07:43:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.011 07:43:50 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.011 ************************************ 00:08:06.011 START TEST accel_decomp_full 00:08:06.011 ************************************ 00:08:06.011 07:43:50 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:06.011 [2024-07-15 07:43:50.452140] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:06.011 [2024-07-15 07:43:50.452206] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559313 ] 00:08:06.011 [2024-07-15 07:43:50.543055] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.011 [2024-07-15 07:43:50.608301] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.011 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.012 07:43:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:07.395 07:43:51 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:07.395 00:08:07.395 real 0m1.344s 00:08:07.395 user 0m1.211s 00:08:07.395 sys 0m0.132s 00:08:07.395 07:43:51 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:07.395 07:43:51 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:07.395 ************************************ 00:08:07.395 END TEST accel_decomp_full 00:08:07.395 ************************************ 00:08:07.395 07:43:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:07.395 07:43:51 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.395 07:43:51 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:07.395 07:43:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.395 07:43:51 accel -- common/autotest_common.sh@10 -- # set +x 00:08:07.395 ************************************ 00:08:07.395 START TEST accel_decomp_mcore 00:08:07.395 ************************************ 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:07.395 07:43:51 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:07.395 [2024-07-15 07:43:51.870317] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:07.396 [2024-07-15 07:43:51.870375] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559614 ] 00:08:07.396 [2024-07-15 07:43:51.962204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:07.396 [2024-07-15 07:43:52.030222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:07.396 [2024-07-15 07:43:52.030373] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:07.396 [2024-07-15 07:43:52.030514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.396 [2024-07-15 07:43:52.030515] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.396 07:43:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.778 00:08:08.778 real 0m1.347s 00:08:08.778 user 0m4.488s 00:08:08.778 sys 0m0.143s 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.778 07:43:53 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:08.778 ************************************ 00:08:08.778 END TEST accel_decomp_mcore 00:08:08.778 ************************************ 00:08:08.778 07:43:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.778 07:43:53 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.778 07:43:53 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:08.778 07:43:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.778 07:43:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.778 ************************************ 00:08:08.778 START TEST accel_decomp_full_mcore 00:08:08.778 ************************************ 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.778 
07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:08.778 [2024-07-15 07:43:53.291164] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:08.778 [2024-07-15 07:43:53.291223] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559938 ] 00:08:08.778 [2024-07-15 07:43:53.379634] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:08.778 [2024-07-15 07:43:53.452216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:08.778 [2024-07-15 07:43:53.452364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:08.778 [2024-07-15 07:43:53.452507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.778 [2024-07-15 07:43:53.452507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:08.778 07:43:53 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.778 07:43:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:10.157 00:08:10.157 real 0m1.370s 00:08:10.157 user 0m4.585s 00:08:10.157 sys 0m0.140s 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.157 07:43:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:10.157 ************************************ 00:08:10.157 END TEST accel_decomp_full_mcore 00:08:10.157 ************************************ 00:08:10.157 07:43:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.157 07:43:54 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.157 07:43:54 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:10.157 07:43:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.157 07:43:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.157 ************************************ 00:08:10.157 START TEST accel_decomp_mthread 00:08:10.157 ************************************ 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.157 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:10.157 
07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:10.157 [2024-07-15 07:43:54.735908] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:10.157 [2024-07-15 07:43:54.735970] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560261 ] 00:08:10.157 [2024-07-15 07:43:54.825304] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.157 [2024-07-15 07:43:54.891338] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.416 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.417 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.417 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.417 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.417 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.417 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.417 07:43:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.417 07:43:54 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.353 00:08:11.353 real 0m1.330s 00:08:11.353 user 0m1.212s 00:08:11.353 sys 0m0.124s 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.353 07:43:56 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:11.353 ************************************ 00:08:11.353 END TEST accel_decomp_mthread 00:08:11.353 ************************************ 00:08:11.353 07:43:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.353 07:43:56 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.354 07:43:56 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:08:11.354 07:43:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.354 07:43:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.354 ************************************ 00:08:11.354 START TEST accel_decomp_full_mthread 00:08:11.354 ************************************ 00:08:11.354 07:43:56 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.354 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:11.354 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:11.614 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:11.614 [2024-07-15 07:43:56.149992] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:11.614 [2024-07-15 07:43:56.150120] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560388 ] 00:08:11.614 [2024-07-15 07:43:56.290982] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.614 [2024-07-15 07:43:56.366044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 
07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.875 07:43:56 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.875 07:43:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:12.816 07:43:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:12.816 00:08:12.816 real 0m1.429s 00:08:12.816 user 0m1.250s 00:08:12.817 sys 0m0.180s 00:08:12.817 07:43:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.817 07:43:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:12.817 ************************************ 00:08:12.817 END TEST accel_decomp_full_mthread 00:08:12.817 ************************************ 
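The decompress cases above and the accel_cdev_* cases that follow all drive the same accel_perf example binary; the exact command line appears in the trace (accel.sh@12). The sketch below restates that invocation as a stand-alone shell snippet for reference. The flag readings in the comments are inferred from the trace values ('1 seconds', '111250 bytes', val=2), not from accel_perf's own documentation, and SPDK_DIR simply mirrors the workspace path logged above.

  # Minimal re-run sketch (assumptions as noted in the lead-in).
  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk

  # Software decompress path used by accel_decomp_full_mthread:
  #   -t 1            -> run for 1 second ('1 seconds' in the trace)
  #   -w decompress   -> workload under test
  #   -l .../bib      -> compressed input file
  #   -o 0            -> full-sized buffers (trace switches from '4096 bytes' to '111250 bytes')
  #   -T 2            -> two worker threads (trace sets val=2)
  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_DIR/test/accel/bib" -y -o 0 -T 2

  # The accel_cdev_* tests below switch the accel module from 'software' to 'dpdk_compressdev'
  # by adding this entry to the JSON config fed to accel_perf/spdk_tgt (only this fragment
  # appears verbatim in the log; the surrounding config wrapper is not reproduced here):
  #   {"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}
  # The harness then verifies opcode ownership via the accel_get_opc_assignments RPC,
  # e.g. (rpc_cmd in the trace wraps scripts/rpc.py):
  #   "$SPDK_DIR/scripts/rpc.py" accel_get_opc_assignments \
  #     | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'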
00:08:12.817 07:43:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:12.817 07:43:57 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:12.817 07:43:57 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:12.817 07:43:57 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:12.817 07:43:57 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:12.817 07:43:57 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1560629 00:08:12.817 07:43:57 accel -- accel/accel.sh@63 -- # waitforlisten 1560629 00:08:13.076 07:43:57 accel -- common/autotest_common.sh@829 -- # '[' -z 1560629 ']' 00:08:13.076 07:43:57 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:13.076 07:43:57 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:13.076 07:43:57 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:13.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:13.076 07:43:57 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:13.076 07:43:57 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:13.076 07:43:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.076 07:43:57 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:13.076 07:43:57 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.076 07:43:57 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.076 07:43:57 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.076 07:43:57 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.076 07:43:57 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:13.076 07:43:57 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:13.076 07:43:57 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:13.076 07:43:57 accel -- accel/accel.sh@41 -- # jq -r . 00:08:13.076 [2024-07-15 07:43:57.644069] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:13.076 [2024-07-15 07:43:57.644124] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560629 ] 00:08:13.076 [2024-07-15 07:43:57.731075] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.076 [2024-07-15 07:43:57.795251] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.645 [2024-07-15 07:43:58.197860] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@862 -- # return 0 00:08:13.906 07:43:58 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:13.906 07:43:58 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:13.906 07:43:58 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:13.906 07:43:58 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:13.906 07:43:58 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:13.906 07:43:58 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:13.906 07:43:58 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.906 07:43:58 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:13.906 "method": "compressdev_scan_accel_module", 00:08:13.906 07:43:58 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:13.906 07:43:58 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:13.906 07:43:58 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.906 07:43:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 
00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.166 07:43:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.166 07:43:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.166 07:43:58 accel -- accel/accel.sh@75 -- # killprocess 1560629 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@948 -- # '[' -z 1560629 ']' 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@952 -- # kill -0 1560629 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@953 -- # uname 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1560629 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1560629' 00:08:14.166 killing process with pid 1560629 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@967 -- # kill 1560629 00:08:14.166 07:43:58 accel -- common/autotest_common.sh@972 -- # wait 1560629 00:08:14.426 07:43:58 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:14.426 07:43:58 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.426 07:43:58 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:14.426 07:43:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.426 07:43:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.426 ************************************ 00:08:14.426 START TEST accel_cdev_comp 00:08:14.426 ************************************ 00:08:14.426 07:43:58 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:14.426 07:43:58 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:14.426 [2024-07-15 07:43:59.027285] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:14.426 [2024-07-15 07:43:59.027334] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560935 ] 00:08:14.426 [2024-07-15 07:43:59.114079] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.686 [2024-07-15 07:43:59.183512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.946 [2024-07-15 07:43:59.584604] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:14.946 [2024-07-15 07:43:59.586364] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14c4640 PMD being used: compress_qat 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 [2024-07-15 07:43:59.589431] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16c93f0 PMD being used: compress_qat 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.946 
07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.946 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.947 07:43:59 accel.accel_cdev_comp 
-- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:14.947 07:43:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:16.335 07:44:00 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:16.335 00:08:16.335 real 0m1.694s 00:08:16.335 user 0m1.395s 00:08:16.335 sys 0m0.292s 00:08:16.335 07:44:00 accel.accel_cdev_comp 
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:16.335 07:44:00 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:16.335 ************************************ 00:08:16.335 END TEST accel_cdev_comp 00:08:16.335 ************************************ 00:08:16.335 07:44:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:16.335 07:44:00 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:16.335 07:44:00 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:16.335 07:44:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.335 07:44:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:16.335 ************************************ 00:08:16.335 START TEST accel_cdev_decomp 00:08:16.335 ************************************ 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:16.335 07:44:00 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:16.335 [2024-07-15 07:44:00.790780] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:16.335 [2024-07-15 07:44:00.790842] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561259 ] 00:08:16.335 [2024-07-15 07:44:00.878780] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.335 [2024-07-15 07:44:00.943917] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.905 [2024-07-15 07:44:01.357466] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:16.905 [2024-07-15 07:44:01.359213] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xef8640 PMD being used: compress_qat 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 [2024-07-15 07:44:01.362379] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10fd3f0 PMD being used: compress_qat 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:16.905 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:08:16.906 07:44:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.843 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:17.844 00:08:17.844 real 0m1.701s 00:08:17.844 user 0m0.008s 00:08:17.844 sys 0m0.001s 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.844 07:44:02 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:17.844 ************************************ 00:08:17.844 END TEST accel_cdev_decomp 00:08:17.844 ************************************ 00:08:17.844 07:44:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:17.844 07:44:02 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.844 07:44:02 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:17.844 07:44:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.844 07:44:02 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.844 ************************************ 00:08:17.844 START TEST accel_cdev_decomp_full 00:08:17.844 ************************************ 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:17.844 07:44:02 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:17.844 [2024-07-15 07:44:02.562895] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:17.844 [2024-07-15 07:44:02.562960] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561583 ] 00:08:18.104 [2024-07-15 07:44:02.648160] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.104 [2024-07-15 07:44:02.713491] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.364 [2024-07-15 07:44:03.113091] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:18.364 [2024-07-15 07:44:03.114844] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16d4640 PMD being used: compress_qat 00:08:18.364 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.364 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 [2024-07-15 07:44:03.117145] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16d7970 PMD being used: compress_qat 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.365 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.625 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:18.626 07:44:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.575 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:19.576 00:08:19.576 real 0m1.685s 00:08:19.576 user 0m0.008s 00:08:19.576 sys 0m0.001s 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:19.576 07:44:04 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:19.576 ************************************ 00:08:19.576 END TEST accel_cdev_decomp_full 00:08:19.576 ************************************ 00:08:19.576 07:44:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:19.576 07:44:04 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:19.576 07:44:04 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:19.576 07:44:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.576 07:44:04 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.576 ************************************ 00:08:19.576 START TEST accel_cdev_decomp_mcore 00:08:19.576 ************************************ 00:08:19.576 07:44:04 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:19.576 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:19.576 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:19.576 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:19.576 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:19.577 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:19.577 [2024-07-15 07:44:04.320749] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
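Every variant in this section drives the same `accel_perf` binary whose full path and flags appear verbatim in the trace above. A minimal stand-alone sketch of launching it the same way is below; the `subsystems` envelope around the `compressdev_scan_accel_module` fragment and the `cfg`/`args` names are assumptions for illustration only — the harness assembles the real config via `build_accel_config`.

```bash
#!/usr/bin/env bash
# Sketch: re-run the multi-core decompress case by hand, mirroring the logged
# command "accel_perf -c /dev/fd/62 -t 1 -w decompress -l .../bib -y -m 0xf".
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Method fragment copied from the trace; the surrounding "subsystems" envelope
# is an assumption about how build_accel_config wraps it.
cfg='{"subsystems":[{"subsystem":"accel","config":[
  {"method":"compressdev_scan_accel_module","params":{"pmd":0}}
]}]}'

args=(
  -c /dev/fd/62               # JSON config read back over fd 62, as in the log
  -t 1                        # 1-second run ("1 seconds" in the trace)
  -w decompress               # workload exercised by these tests
  -l "$SPDK/test/accel/bib"   # compressed input file used by the suite
  -y                          # verification flag, passed exactly as logged
  -m 0xf                      # core mask 0xf -> the four reactors in the log
)

"$SPDK/build/examples/accel_perf" "${args[@]}" 62< <(printf '%s\n' "$cfg")
```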
00:08:19.577 [2024-07-15 07:44:04.320807] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561913 ] 00:08:19.837 [2024-07-15 07:44:04.408867] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:19.837 [2024-07-15 07:44:04.486614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:19.837 [2024-07-15 07:44:04.486764] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:19.837 [2024-07-15 07:44:04.486828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.837 [2024-07-15 07:44:04.486828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:20.406 [2024-07-15 07:44:04.885815] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:20.406 [2024-07-15 07:44:04.887552] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2393ce0 PMD being used: compress_qat 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 [2024-07-15 07:44:04.891720] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff88819b8b0 PMD being used: compress_qat 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:20.406 [2024-07-15 07:44:04.893092] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24b9760 PMD being used: compress_qat 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 [2024-07-15 07:44:04.898927] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff88019b8b0 PMD being used: compress_qat 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.406 [2024-07-15 07:44:04.899067] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff87819b8b0 PMD being used: compress_qat 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:20.406 07:44:04 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:20.406 07:44:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:21.346 00:08:21.346 real 0m1.722s 00:08:21.346 user 0m5.800s 00:08:21.346 sys 0m0.306s 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:21.346 07:44:06 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:21.346 ************************************ 00:08:21.346 END TEST accel_cdev_decomp_mcore 00:08:21.346 ************************************ 00:08:21.346 07:44:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:21.346 07:44:06 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:21.346 07:44:06 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:21.346 07:44:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:21.346 07:44:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:21.346 ************************************ 00:08:21.346 START TEST accel_cdev_decomp_full_mcore 00:08:21.346 ************************************ 00:08:21.346 07:44:06 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:21.346 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:21.346 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:21.346 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.346 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # 
build_accel_config 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:21.347 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:21.608 [2024-07-15 07:44:06.120827] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:21.608 [2024-07-15 07:44:06.120886] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562235 ] 00:08:21.608 [2024-07-15 07:44:06.210095] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:21.608 [2024-07-15 07:44:06.280373] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:21.608 [2024-07-15 07:44:06.280526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:21.608 [2024-07-15 07:44:06.280673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.608 [2024-07-15 07:44:06.280674] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:22.179 [2024-07-15 07:44:06.677870] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:22.179 [2024-07-15 07:44:06.679639] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1694ce0 PMD being used: compress_qat 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.179 [2024-07-15 07:44:06.683088] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd0f419b8b0 PMD being used: compress_qat 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.179 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 [2024-07-15 07:44:06.684377] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x169a1e0 PMD being used: compress_qat 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 [2024-07-15 07:44:06.689919] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd0ec19b8b0 PMD being used: compress_qat 00:08:22.180 [2024-07-15 07:44:06.689998] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd0e419b8b0 PMD being used: compress_qat 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:22.180 07:44:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:23.162 00:08:23.162 real 0m1.722s 00:08:23.162 user 0m5.825s 00:08:23.162 sys 0m0.295s 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:23.162 07:44:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:23.162 ************************************ 00:08:23.162 END TEST accel_cdev_decomp_full_mcore 00:08:23.162 ************************************ 00:08:23.162 07:44:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:23.162 07:44:07 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:23.162 07:44:07 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:23.162 07:44:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.162 07:44:07 accel -- common/autotest_common.sh@10 -- # set +x 00:08:23.162 ************************************ 00:08:23.162 START TEST accel_cdev_decomp_mthread 00:08:23.162 ************************************ 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:23.162 07:44:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:23.162 [2024-07-15 07:44:07.917392] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:23.162 [2024-07-15 07:44:07.917458] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562563 ] 00:08:23.422 [2024-07-15 07:44:08.005766] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.422 [2024-07-15 07:44:08.070890] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.993 [2024-07-15 07:44:08.469719] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:23.993 [2024-07-15 07:44:08.471451] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2776640 PMD being used: compress_qat 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 [2024-07-15 07:44:08.474805] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x277b840 PMD being used: compress_qat 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 [2024-07-15 07:44:08.476518] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x289e320 PMD being used: compress_qat 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:23.993 07:44:08 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 
07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:23.993 07:44:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:24.935 00:08:24.935 real 0m1.695s 00:08:24.935 user 0m1.400s 00:08:24.935 sys 0m0.294s 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.935 07:44:09 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:24.935 ************************************ 00:08:24.935 END TEST accel_cdev_decomp_mthread 00:08:24.935 ************************************ 00:08:24.935 07:44:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:24.935 07:44:09 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.935 07:44:09 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:24.935 07:44:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:24.935 07:44:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:24.935 ************************************ 00:08:24.935 START TEST accel_cdev_decomp_full_mthread 00:08:24.935 ************************************ 00:08:24.935 07:44:09 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.935 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:24.935 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:24.935 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:24.935 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:24.936 07:44:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:24.936 [2024-07-15 07:44:09.687743] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
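Across this section the mcore/mthread variants differ only in the extra flags appended to `accel_test`; the sketch below condenses them into one loop. The flag combinations are copied verbatim from the `run_test` lines in this log, while the loop, the array name, and the assumption that it runs inside the harness (where `run_test` and `accel_test` are defined) are illustrative.

```bash
# Condensed from the run_test invocations in this log. Assumes it runs inside
# the test harness, where run_test and accel_test are defined; the associative
# array is illustrative, the flag combinations are verbatim.
bib=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
declare -A extra=(
  [accel_cdev_decomp_mcore]="-m 0xf"            # core mask 0xf: the 4 reactors in the log
  [accel_cdev_decomp_full_mcore]="-o 0 -m 0xf"  # -o 0 -> the 111250-byte buffers seen above
  [accel_cdev_decomp_mthread]="-T 2"            # matches the val=2 entries in the trace
  [accel_cdev_decomp_full_mthread]="-o 0 -T 2"
)
for name in "${!extra[@]}"; do
  # shellcheck disable=SC2086  # extra flags are intentionally word-split
  run_test "$name" accel_test -t 1 -w decompress -l "$bib" -y ${extra[$name]}
done
```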
00:08:24.936 [2024-07-15 07:44:09.687803] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562887 ] 00:08:25.196 [2024-07-15 07:44:09.775840] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.196 [2024-07-15 07:44:09.852748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.768 [2024-07-15 07:44:10.260168] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:25.768 [2024-07-15 07:44:10.261918] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7d2640 PMD being used: compress_qat 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 [2024-07-15 07:44:10.264441] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7d26e0 PMD being used: compress_qat 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:25.768 [2024-07-15 07:44:10.266319] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x9d72d0 PMD being used: compress_qat 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:25.768 07:44:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:26.710 00:08:26.710 real 0m1.718s 00:08:26.710 user 0m1.406s 00:08:26.710 sys 0m0.313s 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:26.710 07:44:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:26.710 ************************************ 00:08:26.710 END TEST accel_cdev_decomp_full_mthread 00:08:26.710 ************************************ 00:08:26.710 07:44:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:26.710 07:44:11 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:26.710 07:44:11 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:26.710 07:44:11 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:26.710 07:44:11 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:26.710 07:44:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:26.710 07:44:11 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:26.710 07:44:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:26.710 07:44:11 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:26.710 07:44:11 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:26.710 07:44:11 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:26.710 07:44:11 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:26.710 07:44:11 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:26.710 07:44:11 accel -- accel/accel.sh@41 -- # jq -r . 00:08:26.710 ************************************ 00:08:26.710 START TEST accel_dif_functional_tests 00:08:26.710 ************************************ 00:08:26.710 07:44:11 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:26.970 [2024-07-15 07:44:11.503222] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:26.970 [2024-07-15 07:44:11.503273] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563216 ] 00:08:26.970 [2024-07-15 07:44:11.593641] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:26.970 [2024-07-15 07:44:11.672716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:26.970 [2024-07-15 07:44:11.672855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:26.970 [2024-07-15 07:44:11.672945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.231 00:08:27.231 00:08:27.231 CUnit - A unit testing framework for C - Version 2.1-3 00:08:27.231 http://cunit.sourceforge.net/ 00:08:27.231 00:08:27.231 00:08:27.231 Suite: accel_dif 00:08:27.231 Test: verify: DIF generated, GUARD check ...passed 00:08:27.231 Test: verify: DIF generated, APPTAG check ...passed 00:08:27.231 Test: verify: DIF generated, REFTAG check ...passed 00:08:27.231 Test: verify: DIF not generated, GUARD check ...[2024-07-15 07:44:11.740507] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:27.231 passed 00:08:27.231 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 07:44:11.740555] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:27.231 passed 00:08:27.231 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 07:44:11.740578] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:27.231 passed 00:08:27.231 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:27.231 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 07:44:11.740627] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:27.231 passed 00:08:27.231 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:27.231 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:27.231 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:27.231 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 07:44:11.740744] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:27.231 passed 00:08:27.231 Test: verify copy: DIF generated, GUARD check ...passed 00:08:27.231 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:27.231 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:27.231 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 07:44:11.740868] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:27.231 passed 00:08:27.231 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 07:44:11.740893] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:27.231 passed 00:08:27.231 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 07:44:11.740915] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:27.231 passed 00:08:27.231 Test: generate copy: DIF generated, GUARD check ...passed 00:08:27.231 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:27.231 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:27.231 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:08:27.232 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:27.232 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:27.232 Test: generate copy: iovecs-len validate ...[2024-07-15 07:44:11.741109] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:27.232 passed 00:08:27.232 Test: generate copy: buffer alignment validate ...passed 00:08:27.232 00:08:27.232 Run Summary: Type Total Ran Passed Failed Inactive 00:08:27.232 suites 1 1 n/a 0 0 00:08:27.232 tests 26 26 26 0 0 00:08:27.232 asserts 115 115 115 0 n/a 00:08:27.232 00:08:27.232 Elapsed time = 0.002 seconds 00:08:27.232 00:08:27.232 real 0m0.406s 00:08:27.232 user 0m0.519s 00:08:27.232 sys 0m0.162s 00:08:27.232 07:44:11 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:27.232 07:44:11 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:27.232 ************************************ 00:08:27.232 END TEST accel_dif_functional_tests 00:08:27.232 ************************************ 00:08:27.232 07:44:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:27.232 00:08:27.232 real 0m45.096s 00:08:27.232 user 0m54.363s 00:08:27.232 sys 0m7.761s 00:08:27.232 07:44:11 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:27.232 07:44:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:27.232 ************************************ 00:08:27.232 END TEST accel 00:08:27.232 ************************************ 00:08:27.232 07:44:11 -- common/autotest_common.sh@1142 -- # return 0 00:08:27.232 07:44:11 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:27.232 07:44:11 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:27.232 07:44:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.232 07:44:11 -- common/autotest_common.sh@10 -- # set +x 00:08:27.232 ************************************ 00:08:27.232 START TEST accel_rpc 00:08:27.232 ************************************ 00:08:27.232 07:44:11 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:27.493 * Looking for test storage... 00:08:27.493 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:27.493 07:44:12 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:27.493 07:44:12 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1563284 00:08:27.493 07:44:12 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1563284 00:08:27.493 07:44:12 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:27.493 07:44:12 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 1563284 ']' 00:08:27.493 07:44:12 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:27.493 07:44:12 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:27.493 07:44:12 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:27.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:27.493 07:44:12 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:27.493 07:44:12 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:27.493 [2024-07-15 07:44:12.135033] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:27.493 [2024-07-15 07:44:12.135094] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563284 ] 00:08:27.493 [2024-07-15 07:44:12.226950] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.753 [2024-07-15 07:44:12.295110] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.325 07:44:12 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:28.325 07:44:12 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:28.325 07:44:12 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:28.325 07:44:12 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:28.325 07:44:12 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:28.325 07:44:12 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:28.325 07:44:12 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:28.325 07:44:12 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:28.325 07:44:12 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.325 07:44:12 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:28.325 ************************************ 00:08:28.325 START TEST accel_assign_opcode 00:08:28.325 ************************************ 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:28.325 [2024-07-15 07:44:13.013147] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:28.325 [2024-07-15 07:44:13.025172] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:28.325 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:28.586 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:28.587 software 00:08:28.587 00:08:28.587 real 0m0.216s 00:08:28.587 user 0m0.049s 00:08:28.587 sys 0m0.010s 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.587 07:44:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:28.587 ************************************ 00:08:28.587 END TEST accel_assign_opcode 00:08:28.587 ************************************ 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:28.587 07:44:13 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1563284 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 1563284 ']' 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 1563284 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1563284 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1563284' 00:08:28.587 killing process with pid 1563284 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@967 -- # kill 1563284 00:08:28.587 07:44:13 accel_rpc -- common/autotest_common.sh@972 -- # wait 1563284 00:08:28.847 00:08:28.847 real 0m1.541s 00:08:28.847 user 0m1.655s 00:08:28.847 sys 0m0.435s 00:08:28.847 07:44:13 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.847 07:44:13 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:28.847 ************************************ 00:08:28.847 END TEST accel_rpc 00:08:28.847 ************************************ 00:08:28.847 07:44:13 -- common/autotest_common.sh@1142 -- # return 0 00:08:28.847 07:44:13 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:28.847 07:44:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:28.847 07:44:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.847 07:44:13 -- common/autotest_common.sh@10 -- # set +x 00:08:28.847 ************************************ 00:08:28.847 START TEST app_cmdline 00:08:28.847 ************************************ 00:08:28.847 07:44:13 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:29.108 * Looking for test storage... 
00:08:29.108 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:29.108 07:44:13 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:29.108 07:44:13 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1563660 00:08:29.108 07:44:13 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1563660 00:08:29.108 07:44:13 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 1563660 ']' 00:08:29.108 07:44:13 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:29.108 07:44:13 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:29.108 07:44:13 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:29.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:29.108 07:44:13 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:29.108 07:44:13 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:29.108 07:44:13 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:29.108 [2024-07-15 07:44:13.799215] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:29.108 [2024-07-15 07:44:13.799357] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563660 ] 00:08:29.369 [2024-07-15 07:44:13.944531] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.369 [2024-07-15 07:44:14.019218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.311 07:44:14 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:30.311 07:44:14 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:30.311 07:44:14 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:30.882 { 00:08:30.882 "version": "SPDK v24.09-pre git sha1 897e912d5", 00:08:30.882 "fields": { 00:08:30.882 "major": 24, 00:08:30.882 "minor": 9, 00:08:30.882 "patch": 0, 00:08:30.882 "suffix": "-pre", 00:08:30.882 "commit": "897e912d5" 00:08:30.882 } 00:08:30.882 } 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:30.882 07:44:15 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:30.882 07:44:15 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:31.142 request: 00:08:31.142 { 00:08:31.142 "method": "env_dpdk_get_mem_stats", 00:08:31.142 "req_id": 1 00:08:31.142 } 00:08:31.142 Got JSON-RPC error response 00:08:31.142 response: 00:08:31.142 { 00:08:31.142 "code": -32601, 00:08:31.142 "message": "Method not found" 00:08:31.142 } 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:31.142 07:44:15 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1563660 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 1563660 ']' 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 1563660 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1563660 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1563660' 00:08:31.142 killing process with pid 1563660 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@967 -- # kill 1563660 00:08:31.142 07:44:15 app_cmdline -- common/autotest_common.sh@972 -- # wait 1563660 00:08:31.403 00:08:31.403 real 0m2.363s 00:08:31.403 user 0m3.349s 00:08:31.403 sys 0m0.528s 00:08:31.403 07:44:15 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.403 07:44:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
00:08:31.403 ************************************ 00:08:31.403 END TEST app_cmdline 00:08:31.403 ************************************ 00:08:31.403 07:44:15 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.403 07:44:15 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:31.403 07:44:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.403 07:44:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.403 07:44:15 -- common/autotest_common.sh@10 -- # set +x 00:08:31.403 ************************************ 00:08:31.403 START TEST version 00:08:31.403 ************************************ 00:08:31.403 07:44:16 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:31.403 * Looking for test storage... 00:08:31.403 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:31.403 07:44:16 version -- app/version.sh@17 -- # get_header_version major 00:08:31.403 07:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:31.403 07:44:16 version -- app/version.sh@14 -- # cut -f2 00:08:31.403 07:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:08:31.403 07:44:16 version -- app/version.sh@17 -- # major=24 00:08:31.403 07:44:16 version -- app/version.sh@18 -- # get_header_version minor 00:08:31.404 07:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:31.404 07:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:08:31.404 07:44:16 version -- app/version.sh@14 -- # cut -f2 00:08:31.404 07:44:16 version -- app/version.sh@18 -- # minor=9 00:08:31.404 07:44:16 version -- app/version.sh@19 -- # get_header_version patch 00:08:31.404 07:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:31.404 07:44:16 version -- app/version.sh@14 -- # cut -f2 00:08:31.404 07:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:08:31.404 07:44:16 version -- app/version.sh@19 -- # patch=0 00:08:31.404 07:44:16 version -- app/version.sh@20 -- # get_header_version suffix 00:08:31.404 07:44:16 version -- app/version.sh@14 -- # tr -d '"' 00:08:31.404 07:44:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:31.404 07:44:16 version -- app/version.sh@14 -- # cut -f2 00:08:31.404 07:44:16 version -- app/version.sh@20 -- # suffix=-pre 00:08:31.404 07:44:16 version -- app/version.sh@22 -- # version=24.9 00:08:31.404 07:44:16 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:31.404 07:44:16 version -- app/version.sh@28 -- # version=24.9rc0 00:08:31.404 07:44:16 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:31.404 07:44:16 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:31.665 07:44:16 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:31.665 
07:44:16 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:31.665 00:08:31.665 real 0m0.172s 00:08:31.665 user 0m0.093s 00:08:31.665 sys 0m0.115s 00:08:31.665 07:44:16 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.665 07:44:16 version -- common/autotest_common.sh@10 -- # set +x 00:08:31.665 ************************************ 00:08:31.665 END TEST version 00:08:31.665 ************************************ 00:08:31.665 07:44:16 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.665 07:44:16 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:31.665 07:44:16 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:31.665 07:44:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.665 07:44:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.665 07:44:16 -- common/autotest_common.sh@10 -- # set +x 00:08:31.665 ************************************ 00:08:31.665 START TEST blockdev_general 00:08:31.665 ************************************ 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:31.665 * Looking for test storage... 00:08:31.665 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:31.665 07:44:16 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1564223 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1564223 00:08:31.665 07:44:16 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 1564223 ']' 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:31.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:31.665 07:44:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:31.926 [2024-07-15 07:44:16.459698] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:08:31.926 [2024-07-15 07:44:16.459785] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1564223 ] 00:08:31.926 [2024-07-15 07:44:16.554724] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.926 [2024-07-15 07:44:16.629905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.497 07:44:17 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:32.497 07:44:17 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:32.497 07:44:17 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:32.497 07:44:17 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:32.497 07:44:17 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:32.497 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:32.497 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:32.758 [2024-07-15 07:44:17.418463] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:32.758 [2024-07-15 07:44:17.418502] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:32.758 00:08:32.758 [2024-07-15 07:44:17.426457] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:32.758 [2024-07-15 07:44:17.426472] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:32.758 00:08:32.758 Malloc0 00:08:32.758 Malloc1 00:08:32.758 Malloc2 00:08:32.758 Malloc3 00:08:32.758 Malloc4 00:08:32.758 Malloc5 00:08:32.758 Malloc6 00:08:32.758 Malloc7 00:08:33.018 Malloc8 00:08:33.018 Malloc9 00:08:33.018 [2024-07-15 07:44:17.534929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:33.018 [2024-07-15 07:44:17.534965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:33.018 [2024-07-15 
07:44:17.534976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240bdd0 00:08:33.018 [2024-07-15 07:44:17.534983] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:33.018 [2024-07-15 07:44:17.536102] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:33.018 [2024-07-15 07:44:17.536121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:33.018 TestPT 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:33.018 5000+0 records in 00:08:33.018 5000+0 records out 00:08:33.018 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0186954 s, 548 MB/s 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.018 AIO0 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.018 07:44:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.018 07:44:17 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:33.280 07:44:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.280 07:44:17 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:33.280 07:44:17 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:33.281 07:44:17 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "351068c2-f1e6-428f-8de2-b4eb7bf1de6c"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "351068c2-f1e6-428f-8de2-b4eb7bf1de6c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "b5331515-b439-569f-a399-0991749fcd85"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b5331515-b439-569f-a399-0991749fcd85",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "5e7a6265-14d8-5855-8ee9-35611f90f2f4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "5e7a6265-14d8-5855-8ee9-35611f90f2f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "5f196542-a252-5082-8eb3-2dce50cb2935"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5f196542-a252-5082-8eb3-2dce50cb2935",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "a3937c82-f699-550e-a283-e029799125a5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a3937c82-f699-550e-a283-e029799125a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "20bd3fc4-58a6-5ca6-9aa7-d4af40364bce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "20bd3fc4-58a6-5ca6-9aa7-d4af40364bce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "1832e6f7-1cf0-5de2-8ce0-9b93efbb4144"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1832e6f7-1cf0-5de2-8ce0-9b93efbb4144",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"19aefb44-3611-5292-977a-b48c85b3029a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "19aefb44-3611-5292-977a-b48c85b3029a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "e6e53a67-4be8-5574-bf92-44a1dcfba675"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e6e53a67-4be8-5574-bf92-44a1dcfba675",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "655fbb5b-1213-5a31-bbe6-f52e8d77903b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "655fbb5b-1213-5a31-bbe6-f52e8d77903b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b39264c5-79b2-5b75-bc55-5db42a07ced3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b39264c5-79b2-5b75-bc55-5db42a07ced3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f87f8afa-ebd0-5a3c-ac5a-bfc11141ee81"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f87f8afa-ebd0-5a3c-ac5a-bfc11141ee81",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "619d7da8-259a-435a-bdf8-6d8b9ce22576"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "619d7da8-259a-435a-bdf8-6d8b9ce22576",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "619d7da8-259a-435a-bdf8-6d8b9ce22576",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "52647842-8927-49ee-9770-d386e5104a01",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "963e37a6-61cf-4c09-98ca-4d856c71d152",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "313e6746-4168-4e2f-9133-74edaf3d95ad"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "313e6746-4168-4e2f-9133-74edaf3d95ad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "313e6746-4168-4e2f-9133-74edaf3d95ad",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "abb19b59-9928-4846-925e-9cf4c72a4c56",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "564a8074-7dbd-40a2-9201-6fa42b23db14",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "cbacb353-2df9-4e2f-8139-da8f76ccbe78"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cbacb353-2df9-4e2f-8139-da8f76ccbe78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "cbacb353-2df9-4e2f-8139-da8f76ccbe78",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "965a07a3-0ab5-4d68-ad40-da71a09e4e20",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "79040df3-f4c2-4dd7-bb4d-8c95d25d28f6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "25eee8d3-6854-4ff6-b358-93dfdc5335be"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"25eee8d3-6854-4ff6-b358-93dfdc5335be",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:33.281 07:44:17 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:33.281 07:44:17 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:33.281 07:44:17 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:33.281 07:44:17 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 1564223 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 1564223 ']' 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 1564223 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1564223 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1564223' 00:08:33.281 killing process with pid 1564223 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@967 -- # kill 1564223 00:08:33.281 07:44:17 blockdev_general -- common/autotest_common.sh@972 -- # wait 1564223 00:08:33.541 07:44:18 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:33.541 07:44:18 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:33.541 07:44:18 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:33.541 07:44:18 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.541 07:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:33.541 ************************************ 00:08:33.541 START TEST bdev_hello_world 00:08:33.541 ************************************ 00:08:33.541 07:44:18 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:33.541 [2024-07-15 07:44:18.256801] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:33.541 [2024-07-15 07:44:18.256845] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1564447 ] 00:08:33.802 [2024-07-15 07:44:18.342484] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.802 [2024-07-15 07:44:18.409545] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.802 [2024-07-15 07:44:18.543039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:33.802 [2024-07-15 07:44:18.543079] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:33.802 [2024-07-15 07:44:18.543087] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:33.802 [2024-07-15 07:44:18.551043] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:33.802 [2024-07-15 07:44:18.551061] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.063 [2024-07-15 07:44:18.559055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.063 [2024-07-15 07:44:18.559071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.063 [2024-07-15 07:44:18.620154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:34.063 [2024-07-15 07:44:18.620191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:34.063 [2024-07-15 07:44:18.620200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x287b9f0 00:08:34.063 [2024-07-15 07:44:18.620206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:34.063 [2024-07-15 07:44:18.621355] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:34.063 [2024-07-15 07:44:18.621375] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:34.063 [2024-07-15 07:44:18.754028] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:34.063 [2024-07-15 07:44:18.754071] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:34.063 [2024-07-15 07:44:18.754100] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:34.063 [2024-07-15 07:44:18.754145] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:34.063 [2024-07-15 07:44:18.754192] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:34.063 [2024-07-15 07:44:18.754204] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:34.063 [2024-07-15 07:44:18.754239] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
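Note: at this point the hello_bdev example has opened Malloc0, opened an io channel, written a buffer, and read the string back. Stripped of the test-harness wrappers, the step reduces to the single command traced above; the sketch below simply restates it, assuming the workspace layout shown in this log and that any hugepage/driver setup is already handled by the harness.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Same invocation the harness used: load the bdev config and exercise Malloc0.
# The trailing empty argument is kept exactly as it appears in the trace.
$SPDK/build/examples/hello_bdev \
    --json $SPDK/test/bdev/bdev.json \
    -b Malloc0 ''
# Expected NOTICE sequence (as in the log): open the bdev, open an io channel,
# write, read back "Hello World!", then stop the app.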
00:08:34.063 00:08:34.063 [2024-07-15 07:44:18.754259] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:34.323 00:08:34.323 real 0m0.725s 00:08:34.323 user 0m0.468s 00:08:34.323 sys 0m0.202s 00:08:34.323 07:44:18 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.323 07:44:18 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:34.323 ************************************ 00:08:34.323 END TEST bdev_hello_world 00:08:34.323 ************************************ 00:08:34.323 07:44:18 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:34.323 07:44:18 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:34.323 07:44:18 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:34.323 07:44:18 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.323 07:44:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:34.323 ************************************ 00:08:34.323 START TEST bdev_bounds 00:08:34.323 ************************************ 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1564742 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1564742' 00:08:34.323 Process bdevio pid: 1564742 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1564742 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1564742 ']' 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.323 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:34.323 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:34.323 [2024-07-15 07:44:19.057818] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:34.323 [2024-07-15 07:44:19.057862] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1564742 ] 00:08:34.584 [2024-07-15 07:44:19.144497] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:34.584 [2024-07-15 07:44:19.209520] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:34.584 [2024-07-15 07:44:19.209664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.584 [2024-07-15 07:44:19.209664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:34.584 [2024-07-15 07:44:19.329713] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:34.584 [2024-07-15 07:44:19.329755] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:34.584 [2024-07-15 07:44:19.329763] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:34.584 [2024-07-15 07:44:19.337722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.584 [2024-07-15 07:44:19.337739] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:34.844 [2024-07-15 07:44:19.345737] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.844 [2024-07-15 07:44:19.345753] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:34.844 [2024-07-15 07:44:19.406562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:34.844 [2024-07-15 07:44:19.406599] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:34.844 [2024-07-15 07:44:19.406609] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1980ff0 00:08:34.844 [2024-07-15 07:44:19.406616] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:34.844 [2024-07-15 07:44:19.407789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:34.844 [2024-07-15 07:44:19.407808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:35.414 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:35.414 07:44:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:35.414 07:44:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:35.414 I/O targets: 00:08:35.414 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:35.414 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:35.414 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:35.414 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:35.414 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:35.414 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:35.414 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:08:35.414 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:35.414 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:35.414 00:08:35.414 00:08:35.414 CUnit - A unit testing framework for C - Version 2.1-3 00:08:35.414 http://cunit.sourceforge.net/ 00:08:35.414 00:08:35.414 00:08:35.414 Suite: bdevio tests on: AIO0 00:08:35.414 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: raid1 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: concat0 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split 
...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: raid0 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: TestPT 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: Malloc2p7 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: Malloc2p6 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor 
specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.415 Test: blockdev copy ...passed 00:08:35.415 Suite: bdevio tests on: Malloc2p5 00:08:35.415 Test: blockdev write read block ...passed 00:08:35.415 Test: blockdev write zeroes read block ...passed 00:08:35.415 Test: blockdev write zeroes read no split ...passed 00:08:35.415 Test: blockdev write zeroes read split ...passed 00:08:35.415 Test: blockdev write zeroes read split partial ...passed 00:08:35.415 Test: blockdev reset ...passed 00:08:35.415 Test: blockdev write read 8 blocks ...passed 00:08:35.415 Test: blockdev write read size > 128k ...passed 00:08:35.415 Test: blockdev write read invalid size ...passed 00:08:35.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.415 Test: blockdev write read max offset ...passed 00:08:35.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.415 Test: blockdev writev readv 8 blocks ...passed 00:08:35.415 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.415 Test: blockdev writev readv block ...passed 00:08:35.415 Test: blockdev writev readv size > 128k ...passed 00:08:35.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.415 Test: blockdev comparev and writev ...passed 00:08:35.415 Test: blockdev nvme passthru rw ...passed 00:08:35.415 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.415 Test: blockdev nvme admin passthru ...passed 00:08:35.416 Test: blockdev copy ...passed 00:08:35.416 Suite: bdevio tests on: Malloc2p4 00:08:35.416 Test: blockdev write read block ...passed 00:08:35.416 Test: blockdev write zeroes read block ...passed 00:08:35.416 Test: blockdev write zeroes read no split ...passed 00:08:35.416 Test: blockdev write zeroes read split ...passed 00:08:35.416 Test: blockdev write zeroes read split partial ...passed 00:08:35.416 Test: blockdev reset ...passed 00:08:35.416 Test: blockdev write read 8 blocks ...passed 00:08:35.416 Test: blockdev write read size > 128k ...passed 00:08:35.416 Test: blockdev write read invalid size ...passed 00:08:35.416 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.416 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.416 Test: blockdev write read max offset ...passed 00:08:35.416 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.416 Test: blockdev writev readv 8 blocks ...passed 00:08:35.416 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.416 Test: blockdev writev readv block ...passed 00:08:35.416 Test: blockdev writev readv size > 128k ...passed 00:08:35.416 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.416 Test: blockdev comparev and writev ...passed 00:08:35.416 Test: blockdev nvme passthru rw ...passed 00:08:35.416 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.416 Test: blockdev nvme admin passthru ...passed 00:08:35.416 Test: blockdev copy ...passed 00:08:35.416 Suite: bdevio tests on: Malloc2p3 00:08:35.416 Test: blockdev write read block ...passed 00:08:35.416 Test: blockdev write zeroes read block ...passed 00:08:35.416 Test: blockdev write zeroes read no split ...passed 00:08:35.416 Test: blockdev write zeroes read split ...passed 00:08:35.677 Test: blockdev write zeroes read split partial ...passed 00:08:35.677 Test: blockdev reset ...passed 00:08:35.677 Test: 
blockdev write read 8 blocks ...passed 00:08:35.677 Test: blockdev write read size > 128k ...passed 00:08:35.677 Test: blockdev write read invalid size ...passed 00:08:35.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.677 Test: blockdev write read max offset ...passed 00:08:35.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.677 Test: blockdev writev readv 8 blocks ...passed 00:08:35.677 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.677 Test: blockdev writev readv block ...passed 00:08:35.677 Test: blockdev writev readv size > 128k ...passed 00:08:35.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.677 Test: blockdev comparev and writev ...passed 00:08:35.677 Test: blockdev nvme passthru rw ...passed 00:08:35.677 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.677 Test: blockdev nvme admin passthru ...passed 00:08:35.677 Test: blockdev copy ...passed 00:08:35.677 Suite: bdevio tests on: Malloc2p2 00:08:35.677 Test: blockdev write read block ...passed 00:08:35.677 Test: blockdev write zeroes read block ...passed 00:08:35.677 Test: blockdev write zeroes read no split ...passed 00:08:35.677 Test: blockdev write zeroes read split ...passed 00:08:35.677 Test: blockdev write zeroes read split partial ...passed 00:08:35.677 Test: blockdev reset ...passed 00:08:35.677 Test: blockdev write read 8 blocks ...passed 00:08:35.677 Test: blockdev write read size > 128k ...passed 00:08:35.677 Test: blockdev write read invalid size ...passed 00:08:35.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.677 Test: blockdev write read max offset ...passed 00:08:35.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.677 Test: blockdev writev readv 8 blocks ...passed 00:08:35.677 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.677 Test: blockdev writev readv block ...passed 00:08:35.677 Test: blockdev writev readv size > 128k ...passed 00:08:35.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.677 Test: blockdev comparev and writev ...passed 00:08:35.677 Test: blockdev nvme passthru rw ...passed 00:08:35.677 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.677 Test: blockdev nvme admin passthru ...passed 00:08:35.677 Test: blockdev copy ...passed 00:08:35.677 Suite: bdevio tests on: Malloc2p1 00:08:35.677 Test: blockdev write read block ...passed 00:08:35.677 Test: blockdev write zeroes read block ...passed 00:08:35.677 Test: blockdev write zeroes read no split ...passed 00:08:35.677 Test: blockdev write zeroes read split ...passed 00:08:35.677 Test: blockdev write zeroes read split partial ...passed 00:08:35.677 Test: blockdev reset ...passed 00:08:35.677 Test: blockdev write read 8 blocks ...passed 00:08:35.677 Test: blockdev write read size > 128k ...passed 00:08:35.677 Test: blockdev write read invalid size ...passed 00:08:35.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.677 Test: blockdev write read max offset ...passed 00:08:35.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.677 Test: blockdev writev readv 8 blocks ...passed 00:08:35.677 
Test: blockdev writev readv 30 x 1block ...passed 00:08:35.677 Test: blockdev writev readv block ...passed 00:08:35.677 Test: blockdev writev readv size > 128k ...passed 00:08:35.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.677 Test: blockdev comparev and writev ...passed 00:08:35.677 Test: blockdev nvme passthru rw ...passed 00:08:35.677 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.677 Test: blockdev nvme admin passthru ...passed 00:08:35.677 Test: blockdev copy ...passed 00:08:35.677 Suite: bdevio tests on: Malloc2p0 00:08:35.677 Test: blockdev write read block ...passed 00:08:35.677 Test: blockdev write zeroes read block ...passed 00:08:35.677 Test: blockdev write zeroes read no split ...passed 00:08:35.677 Test: blockdev write zeroes read split ...passed 00:08:35.677 Test: blockdev write zeroes read split partial ...passed 00:08:35.677 Test: blockdev reset ...passed 00:08:35.677 Test: blockdev write read 8 blocks ...passed 00:08:35.677 Test: blockdev write read size > 128k ...passed 00:08:35.677 Test: blockdev write read invalid size ...passed 00:08:35.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.677 Test: blockdev write read max offset ...passed 00:08:35.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.677 Test: blockdev writev readv 8 blocks ...passed 00:08:35.677 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.677 Test: blockdev writev readv block ...passed 00:08:35.677 Test: blockdev writev readv size > 128k ...passed 00:08:35.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.677 Test: blockdev comparev and writev ...passed 00:08:35.677 Test: blockdev nvme passthru rw ...passed 00:08:35.677 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.677 Test: blockdev nvme admin passthru ...passed 00:08:35.677 Test: blockdev copy ...passed 00:08:35.677 Suite: bdevio tests on: Malloc1p1 00:08:35.677 Test: blockdev write read block ...passed 00:08:35.677 Test: blockdev write zeroes read block ...passed 00:08:35.677 Test: blockdev write zeroes read no split ...passed 00:08:35.677 Test: blockdev write zeroes read split ...passed 00:08:35.677 Test: blockdev write zeroes read split partial ...passed 00:08:35.677 Test: blockdev reset ...passed 00:08:35.677 Test: blockdev write read 8 blocks ...passed 00:08:35.677 Test: blockdev write read size > 128k ...passed 00:08:35.677 Test: blockdev write read invalid size ...passed 00:08:35.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.677 Test: blockdev write read max offset ...passed 00:08:35.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.677 Test: blockdev writev readv 8 blocks ...passed 00:08:35.677 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.677 Test: blockdev writev readv block ...passed 00:08:35.677 Test: blockdev writev readv size > 128k ...passed 00:08:35.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.677 Test: blockdev comparev and writev ...passed 00:08:35.677 Test: blockdev nvme passthru rw ...passed 00:08:35.678 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.678 Test: blockdev nvme admin passthru ...passed 00:08:35.678 Test: blockdev copy ...passed 00:08:35.678 Suite: 
bdevio tests on: Malloc1p0 00:08:35.678 Test: blockdev write read block ...passed 00:08:35.678 Test: blockdev write zeroes read block ...passed 00:08:35.678 Test: blockdev write zeroes read no split ...passed 00:08:35.678 Test: blockdev write zeroes read split ...passed 00:08:35.678 Test: blockdev write zeroes read split partial ...passed 00:08:35.678 Test: blockdev reset ...passed 00:08:35.678 Test: blockdev write read 8 blocks ...passed 00:08:35.678 Test: blockdev write read size > 128k ...passed 00:08:35.678 Test: blockdev write read invalid size ...passed 00:08:35.678 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.678 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.678 Test: blockdev write read max offset ...passed 00:08:35.678 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.678 Test: blockdev writev readv 8 blocks ...passed 00:08:35.678 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.678 Test: blockdev writev readv block ...passed 00:08:35.678 Test: blockdev writev readv size > 128k ...passed 00:08:35.678 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.678 Test: blockdev comparev and writev ...passed 00:08:35.678 Test: blockdev nvme passthru rw ...passed 00:08:35.678 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.678 Test: blockdev nvme admin passthru ...passed 00:08:35.678 Test: blockdev copy ...passed 00:08:35.678 Suite: bdevio tests on: Malloc0 00:08:35.678 Test: blockdev write read block ...passed 00:08:35.678 Test: blockdev write zeroes read block ...passed 00:08:35.678 Test: blockdev write zeroes read no split ...passed 00:08:35.678 Test: blockdev write zeroes read split ...passed 00:08:35.678 Test: blockdev write zeroes read split partial ...passed 00:08:35.678 Test: blockdev reset ...passed 00:08:35.678 Test: blockdev write read 8 blocks ...passed 00:08:35.678 Test: blockdev write read size > 128k ...passed 00:08:35.678 Test: blockdev write read invalid size ...passed 00:08:35.678 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:35.678 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:35.678 Test: blockdev write read max offset ...passed 00:08:35.678 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:35.678 Test: blockdev writev readv 8 blocks ...passed 00:08:35.678 Test: blockdev writev readv 30 x 1block ...passed 00:08:35.678 Test: blockdev writev readv block ...passed 00:08:35.678 Test: blockdev writev readv size > 128k ...passed 00:08:35.678 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:35.678 Test: blockdev comparev and writev ...passed 00:08:35.678 Test: blockdev nvme passthru rw ...passed 00:08:35.678 Test: blockdev nvme passthru vendor specific ...passed 00:08:35.678 Test: blockdev nvme admin passthru ...passed 00:08:35.678 Test: blockdev copy ...passed 00:08:35.678 00:08:35.678 Run Summary: Type Total Ran Passed Failed Inactive 00:08:35.678 suites 16 16 n/a 0 0 00:08:35.678 tests 368 368 368 0 0 00:08:35.678 asserts 2224 2224 2224 0 n/a 00:08:35.678 00:08:35.678 Elapsed time = 0.640 seconds 00:08:35.678 0 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1564742 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1564742 ']' 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1564742 
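Note: the 368 assertions summarized above come from pairing two commands that appear earlier in this trace: bdevio started with the flags shown (-w -s 0 --json ... ''), and tests.py perform_tests driving one CUnit suite per bdev over the default /var/tmp/spdk.sock RPC socket. A condensed sketch of that pairing follows; backgrounding bdevio by hand is an assumption, since the harness tracks the pid itself.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Start the bdevio app with the same arguments shown in the trace.
$SPDK/test/bdev/bdevio/bdevio -w -s 0 --json $SPDK/test/bdev/bdev.json '' &
bdevio_pid=$!
# Once it is listening on /var/tmp/spdk.sock (the waitforlisten message above),
# run the per-bdev CUnit suites over RPC, then stop the app.
$SPDK/test/bdev/bdevio/tests.py perform_tests
kill $bdevio_pid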
00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1564742 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1564742' 00:08:35.678 killing process with pid 1564742 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1564742 00:08:35.678 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1564742 00:08:35.938 07:44:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:35.938 00:08:35.938 real 0m1.505s 00:08:35.938 user 0m3.810s 00:08:35.938 sys 0m0.329s 00:08:35.938 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:35.938 07:44:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:35.938 ************************************ 00:08:35.938 END TEST bdev_bounds 00:08:35.938 ************************************ 00:08:35.938 07:44:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:35.938 07:44:20 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:35.938 07:44:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:35.938 07:44:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:35.938 07:44:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:35.938 ************************************ 00:08:35.938 START TEST bdev_nbd 00:08:35.938 ************************************ 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:35.938 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1565078 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1565078 /var/tmp/spdk-nbd.sock 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1565078 ']' 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:35.939 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:35.939 07:44:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:35.939 [2024-07-15 07:44:20.647777] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:08:35.939 [2024-07-15 07:44:20.647825] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:36.198 [2024-07-15 07:44:20.737140] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.198 [2024-07-15 07:44:20.802654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.198 [2024-07-15 07:44:20.920075] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:36.198 [2024-07-15 07:44:20.920118] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:36.198 [2024-07-15 07:44:20.920127] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:36.198 [2024-07-15 07:44:20.928082] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:36.199 [2024-07-15 07:44:20.928099] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:36.199 [2024-07-15 07:44:20.936094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:36.199 [2024-07-15 07:44:20.936109] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:36.458 [2024-07-15 07:44:20.996915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:36.458 [2024-07-15 07:44:20.996951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:36.458 [2024-07-15 07:44:20.996962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c8a00 00:08:36.458 [2024-07-15 07:44:20.996968] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:36.458 [2024-07-15 07:44:20.998102] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:36.458 [2024-07-15 07:44:20.998122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.719 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:36.981 1+0 records in 00:08:36.981 1+0 records out 00:08:36.981 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246427 s, 16.6 MB/s 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:36.981 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:37.243 07:44:21 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.243 1+0 records in 00:08:37.243 1+0 records out 00:08:37.243 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244784 s, 16.7 MB/s 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.243 07:44:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.505 1+0 records in 00:08:37.505 1+0 records out 00:08:37.505 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021376 s, 19.2 MB/s 00:08:37.505 
07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.505 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:37.766 1+0 records in 00:08:37.766 1+0 records out 00:08:37.766 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302635 s, 13.5 MB/s 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:37.766 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.026 1+0 records in 00:08:38.026 1+0 records out 00:08:38.026 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292585 s, 14.0 MB/s 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.026 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:08:38.287 1+0 records in 00:08:38.287 1+0 records out 00:08:38.287 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273557 s, 15.0 MB/s 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.287 07:44:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.287 1+0 records in 00:08:38.287 1+0 records out 00:08:38.287 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272632 s, 15.0 MB/s 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.287 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
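Each bdev in the trace goes through the same attach handshake: the loop in bdev/nbd_common.sh asks the RPC server listening on /var/tmp/spdk-nbd.sock for the next free /dev/nbdX via nbd_start_disk, then waitfornbd (common/autotest_common.sh) polls /proc/partitions and confirms the device with a single 4 KiB direct-I/O read before the loop advances. A minimal sketch of that handshake, reconstructed from the traced commands, follows; the sleep/back-off branches never fire in a passing run and are assumed, and $rpc_py, $rpc_server and $nbdtest_file stand in for the full paths shown in the trace.

# Readiness check exercised at autotest_common.sh@866-887 in the trace.
waitfornbd() {
	local nbd_name=$1
	local i size

	# Wait (up to 20 attempts) for the kernel to list the device in /proc/partitions.
	for ((i = 1; i <= 20; i++)); do
		if grep -q -w "$nbd_name" /proc/partitions; then
			break
		fi
		sleep 0.1 # assumed back-off; a passing run breaks on the first attempt
	done

	# The device can show up before it is readable, so verify with one direct-I/O read.
	for ((i = 1; i <= 20; i++)); do
		if dd if="/dev/$nbd_name" of="$nbdtest_file" bs=4096 count=1 iflag=direct; then
			size=$(stat -c %s "$nbdtest_file")
			rm -f "$nbdtest_file"
			if [ "$size" != "0" ]; then
				return 0
			fi
		fi
		sleep 0.1 # assumed back-off
	done
	return 1
}

# Caller side (nbd_common.sh@27-30 in the trace): one RPC per bdev, gated on waitfornbd.
for bdev in Malloc1p1 Malloc2p0 Malloc2p1; do # illustrative subset of the 16 bdevs
	nbd_device=$("$rpc_py" -s "$rpc_server" nbd_start_disk "$bdev")
	waitfornbd "$(basename "$nbd_device")"
done

Teardown later in the trace mirrors this handshake: nbd_stop_disk is issued per device and waitfornbd_exit (nbd_common.sh@35-45) greps /proc/partitions until the name disappears before returning.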
00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.547 1+0 records in 00:08:38.547 1+0 records out 00:08:38.547 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309314 s, 13.2 MB/s 00:08:38.547 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.548 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:38.807 07:44:23 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:38.807 1+0 records in 00:08:38.807 1+0 records out 00:08:38.807 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000442002 s, 9.3 MB/s 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:38.807 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.067 1+0 records in 00:08:39.067 1+0 records out 00:08:39.067 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345043 s, 11.9 MB/s 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:39.067 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:39.067 07:44:23 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.326 1+0 records in 00:08:39.326 1+0 records out 00:08:39.326 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000409537 s, 10.0 MB/s 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:39.326 07:44:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.596 1+0 records in 00:08:39.596 1+0 records out 00:08:39.596 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311966 s, 13.1 MB/s 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:39.596 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.902 1+0 records in 00:08:39.902 1+0 records out 00:08:39.902 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000399524 s, 10.3 MB/s 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.902 07:44:24 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:39.902 1+0 records in 00:08:39.902 1+0 records out 00:08:39.902 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000477425 s, 8.6 MB/s 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:39.902 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.163 1+0 records in 00:08:40.163 1+0 records out 00:08:40.163 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000491846 s, 8.3 MB/s 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:40.163 07:44:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:40.423 1+0 records in 00:08:40.423 1+0 records out 00:08:40.423 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000414457 s, 9.9 MB/s 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:40.423 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:40.682 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd0", 00:08:40.683 "bdev_name": "Malloc0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd1", 00:08:40.683 "bdev_name": "Malloc1p0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd2", 00:08:40.683 "bdev_name": "Malloc1p1" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd3", 00:08:40.683 "bdev_name": "Malloc2p0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd4", 00:08:40.683 "bdev_name": "Malloc2p1" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd5", 00:08:40.683 "bdev_name": "Malloc2p2" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd6", 00:08:40.683 "bdev_name": "Malloc2p3" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd7", 00:08:40.683 "bdev_name": "Malloc2p4" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd8", 00:08:40.683 "bdev_name": "Malloc2p5" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd9", 00:08:40.683 "bdev_name": "Malloc2p6" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd10", 00:08:40.683 "bdev_name": "Malloc2p7" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd11", 00:08:40.683 "bdev_name": "TestPT" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd12", 00:08:40.683 "bdev_name": "raid0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd13", 00:08:40.683 "bdev_name": "concat0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd14", 00:08:40.683 "bdev_name": "raid1" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd15", 00:08:40.683 "bdev_name": "AIO0" 00:08:40.683 } 00:08:40.683 ]' 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd0", 00:08:40.683 "bdev_name": "Malloc0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd1", 00:08:40.683 "bdev_name": "Malloc1p0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd2", 00:08:40.683 "bdev_name": "Malloc1p1" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd3", 00:08:40.683 "bdev_name": "Malloc2p0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd4", 00:08:40.683 "bdev_name": "Malloc2p1" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd5", 00:08:40.683 "bdev_name": "Malloc2p2" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd6", 00:08:40.683 "bdev_name": "Malloc2p3" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd7", 00:08:40.683 "bdev_name": "Malloc2p4" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd8", 00:08:40.683 "bdev_name": "Malloc2p5" 
00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd9", 00:08:40.683 "bdev_name": "Malloc2p6" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd10", 00:08:40.683 "bdev_name": "Malloc2p7" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd11", 00:08:40.683 "bdev_name": "TestPT" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd12", 00:08:40.683 "bdev_name": "raid0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd13", 00:08:40.683 "bdev_name": "concat0" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd14", 00:08:40.683 "bdev_name": "raid1" 00:08:40.683 }, 00:08:40.683 { 00:08:40.683 "nbd_device": "/dev/nbd15", 00:08:40.683 "bdev_name": "AIO0" 00:08:40.683 } 00:08:40.683 ]' 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.683 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:40.942 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.201 07:44:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.460 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:41.717 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:41.717 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:41.717 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:41.717 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.717 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.717 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:41.718 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.718 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.718 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:08:41.718 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:41.718 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:41.976 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.234 07:44:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.546 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:42.805 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:43.064 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:43.064 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:43.064 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:43.064 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:43.064 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:43.064 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:08:43.065 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:43.065 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:43.065 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:43.065 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:43.324 07:44:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:43.324 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:43.584 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:43.843 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.103 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:44.362 /dev/nbd0 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.362 1+0 records in 00:08:44.362 1+0 records out 00:08:44.362 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260931 s, 15.7 MB/s 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.362 07:44:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:08:44.621 /dev/nbd1 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.621 1+0 records in 00:08:44.621 1+0 records out 00:08:44.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276201 s, 14.8 MB/s 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:08:44.621 /dev/nbd10 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:44.621 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.881 1+0 records in 00:08:44.881 1+0 records out 00:08:44.881 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026374 s, 15.5 MB/s 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:08:44.881 /dev/nbd11 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.881 1+0 records in 00:08:44.881 1+0 records out 00:08:44.881 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314318 s, 13.0 MB/s 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.881 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.882 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.882 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.882 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:44.882 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:08:45.141 /dev/nbd12 00:08:45.141 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:08:45.141 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.142 1+0 records in 00:08:45.142 1+0 records out 00:08:45.142 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289755 s, 14.1 MB/s 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.142 07:44:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:08:45.402 /dev/nbd13 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.402 07:44:30 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.402 1+0 records in 00:08:45.402 1+0 records out 00:08:45.402 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310345 s, 13.2 MB/s 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.402 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:08:45.663 /dev/nbd14 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.663 1+0 records in 00:08:45.663 1+0 records out 00:08:45.663 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304816 s, 13.4 MB/s 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.663 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:08:45.924 /dev/nbd15 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.924 1+0 records in 00:08:45.924 1+0 records out 00:08:45.924 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279018 s, 14.7 MB/s 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:45.924 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:08:46.184 /dev/nbd2 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.184 07:44:30 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.184 1+0 records in 00:08:46.184 1+0 records out 00:08:46.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334309 s, 12.3 MB/s 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.184 07:44:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:08:46.443 /dev/nbd3 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.443 1+0 records in 00:08:46.443 1+0 records out 00:08:46.443 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000350362 s, 11.7 MB/s 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.443 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:08:46.702 /dev/nbd4 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.703 1+0 records in 00:08:46.703 1+0 records out 00:08:46.703 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317123 s, 12.9 MB/s 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:08:46.703 /dev/nbd5 00:08:46.703 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.963 07:44:31 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.963 1+0 records in 00:08:46.963 1+0 records out 00:08:46.963 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000446599 s, 9.2 MB/s 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:08:46.963 /dev/nbd6 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.963 1+0 records in 00:08:46.963 1+0 records out 00:08:46.963 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000513629 s, 8.0 MB/s 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:46.963 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:08:47.223 /dev/nbd7 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:47.223 1+0 records in 00:08:47.223 1+0 records out 00:08:47.223 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00045808 s, 8.9 MB/s 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:47.223 07:44:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:47.483 /dev/nbd8 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:47.483 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:47.484 1+0 records in 00:08:47.484 1+0 records out 00:08:47.484 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000384731 s, 10.6 MB/s 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:47.484 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:47.744 /dev/nbd9 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:47.744 1+0 records in 00:08:47.744 1+0 records out 00:08:47.744 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000434705 s, 9.4 MB/s 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
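The trace above repeats one pattern per block device: an nbd_start_disk RPC maps a bdev (Malloc1p0 through AIO0) onto a /dev/nbdN node, then a wait helper polls /proc/partitions and confirms the node actually serves reads before the loop moves on to the next device. The following is a condensed bash sketch of that readiness check, reconstructed from the commands visible in the trace; the real helper lives in common/autotest_common.sh, and the function name, sleep interval, and temp-file handling here are simplifying assumptions, not the exact script.

    # wait_for_nbd NAME: sketch of the per-device readiness check seen in the trace.
    wait_for_nbd() {
        local nbd_name=$1 i tmp size
        # Poll up to 20 times for the device to show up in /proc/partitions.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # assumed interval; the trace does not show the delay
        done
        # Prove the device serves I/O: read one 4 KiB block with O_DIRECT.
        tmp=$(mktemp)
        dd if="/dev/$nbd_name" of="$tmp" bs=4096 count=1 iflag=direct
        size=$(stat -c %s "$tmp")
        rm -f "$tmp"
        # A non-empty read (the "4096 '!=' 0" check in the trace) means the device is ready.
        [ "$size" != 0 ]
    }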
00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.744 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:48.004 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd0", 00:08:48.004 "bdev_name": "Malloc0" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd1", 00:08:48.004 "bdev_name": "Malloc1p0" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd10", 00:08:48.004 "bdev_name": "Malloc1p1" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd11", 00:08:48.004 "bdev_name": "Malloc2p0" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd12", 00:08:48.004 "bdev_name": "Malloc2p1" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd13", 00:08:48.004 "bdev_name": "Malloc2p2" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd14", 00:08:48.004 "bdev_name": "Malloc2p3" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd15", 00:08:48.004 "bdev_name": "Malloc2p4" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd2", 00:08:48.004 "bdev_name": "Malloc2p5" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd3", 00:08:48.004 "bdev_name": "Malloc2p6" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd4", 00:08:48.004 "bdev_name": "Malloc2p7" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd5", 00:08:48.004 "bdev_name": "TestPT" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd6", 00:08:48.004 "bdev_name": "raid0" 00:08:48.004 }, 00:08:48.004 { 00:08:48.004 "nbd_device": "/dev/nbd7", 00:08:48.004 "bdev_name": "concat0" 00:08:48.004 }, 00:08:48.004 { 00:08:48.005 "nbd_device": "/dev/nbd8", 00:08:48.005 "bdev_name": "raid1" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd9", 00:08:48.005 "bdev_name": "AIO0" 00:08:48.005 } 00:08:48.005 ]' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd0", 00:08:48.005 "bdev_name": "Malloc0" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd1", 00:08:48.005 "bdev_name": "Malloc1p0" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd10", 00:08:48.005 "bdev_name": "Malloc1p1" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd11", 00:08:48.005 "bdev_name": "Malloc2p0" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd12", 00:08:48.005 "bdev_name": "Malloc2p1" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd13", 00:08:48.005 "bdev_name": "Malloc2p2" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd14", 00:08:48.005 "bdev_name": "Malloc2p3" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd15", 
00:08:48.005 "bdev_name": "Malloc2p4" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd2", 00:08:48.005 "bdev_name": "Malloc2p5" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd3", 00:08:48.005 "bdev_name": "Malloc2p6" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd4", 00:08:48.005 "bdev_name": "Malloc2p7" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd5", 00:08:48.005 "bdev_name": "TestPT" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd6", 00:08:48.005 "bdev_name": "raid0" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd7", 00:08:48.005 "bdev_name": "concat0" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd8", 00:08:48.005 "bdev_name": "raid1" 00:08:48.005 }, 00:08:48.005 { 00:08:48.005 "nbd_device": "/dev/nbd9", 00:08:48.005 "bdev_name": "AIO0" 00:08:48.005 } 00:08:48.005 ]' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:48.005 /dev/nbd1 00:08:48.005 /dev/nbd10 00:08:48.005 /dev/nbd11 00:08:48.005 /dev/nbd12 00:08:48.005 /dev/nbd13 00:08:48.005 /dev/nbd14 00:08:48.005 /dev/nbd15 00:08:48.005 /dev/nbd2 00:08:48.005 /dev/nbd3 00:08:48.005 /dev/nbd4 00:08:48.005 /dev/nbd5 00:08:48.005 /dev/nbd6 00:08:48.005 /dev/nbd7 00:08:48.005 /dev/nbd8 00:08:48.005 /dev/nbd9' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:48.005 /dev/nbd1 00:08:48.005 /dev/nbd10 00:08:48.005 /dev/nbd11 00:08:48.005 /dev/nbd12 00:08:48.005 /dev/nbd13 00:08:48.005 /dev/nbd14 00:08:48.005 /dev/nbd15 00:08:48.005 /dev/nbd2 00:08:48.005 /dev/nbd3 00:08:48.005 /dev/nbd4 00:08:48.005 /dev/nbd5 00:08:48.005 /dev/nbd6 00:08:48.005 /dev/nbd7 00:08:48.005 /dev/nbd8 00:08:48.005 /dev/nbd9' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:48.005 256+0 records in 00:08:48.005 256+0 records out 00:08:48.005 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114675 s, 91.4 MB/s 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:48.005 256+0 records in 00:08:48.005 256+0 records out 00:08:48.005 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0852023 s, 12.3 MB/s 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.005 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:48.264 256+0 records in 00:08:48.264 256+0 records out 00:08:48.264 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0828615 s, 12.7 MB/s 00:08:48.264 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.264 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:48.264 256+0 records in 00:08:48.264 256+0 records out 00:08:48.264 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0855922 s, 12.3 MB/s 00:08:48.264 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.264 07:44:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:48.264 256+0 records in 00:08:48.264 256+0 records out 00:08:48.264 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0844153 s, 12.4 MB/s 00:08:48.264 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.264 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:48.523 256+0 records in 00:08:48.523 256+0 records out 00:08:48.523 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.087043 s, 12.0 MB/s 00:08:48.523 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.523 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:48.523 256+0 records in 00:08:48.523 256+0 records out 00:08:48.523 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0827855 s, 12.7 MB/s 00:08:48.523 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.523 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:48.523 256+0 records in 00:08:48.523 256+0 records out 00:08:48.523 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0844414 s, 12.4 MB/s 00:08:48.523 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.523 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:48.783 256+0 records in 00:08:48.783 256+0 records out 00:08:48.783 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0919996 s, 11.4 MB/s 00:08:48.783 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.783 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:48.783 256+0 records in 00:08:48.783 256+0 records out 00:08:48.783 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0926349 s, 11.3 MB/s 00:08:48.783 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:48.783 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:49.043 256+0 records in 00:08:49.043 256+0 records out 00:08:49.043 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0901573 s, 11.6 MB/s 00:08:49.043 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.043 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:49.043 256+0 records in 00:08:49.043 256+0 records out 00:08:49.043 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0905077 s, 11.6 MB/s 00:08:49.043 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.043 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:49.043 256+0 records in 00:08:49.043 256+0 records out 00:08:49.043 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.089184 s, 11.8 MB/s 00:08:49.043 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.043 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:49.303 256+0 records in 00:08:49.303 256+0 records out 00:08:49.303 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0899015 s, 11.7 MB/s 00:08:49.303 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.303 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:49.303 256+0 records in 00:08:49.303 256+0 records out 00:08:49.303 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0884875 s, 11.8 MB/s 00:08:49.303 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.303 07:44:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:49.303 256+0 records in 00:08:49.303 256+0 records out 00:08:49.303 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0883349 s, 11.9 MB/s 00:08:49.303 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.303 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:08:49.563 256+0 records in 00:08:49.563 256+0 records out 00:08:49.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0832503 s, 12.6 MB/s 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:08:49.563 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.564 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.825 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.085 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:50.345 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.345 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.345 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.345 07:44:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:50.913 07:44:35 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.913 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.172 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:08:51.430 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:08:51.430 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:08:51.430 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:08:51.430 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.430 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.430 07:44:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:08:51.430 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.430 07:44:36 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.430 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.430 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.689 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.948 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:52.208 07:44:36 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.208 07:44:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.468 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.728 07:44:37 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.728 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:52.988 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:53.248 07:44:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:53.509 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:53.509 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:53.509 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:53.509 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:53.509 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:08:53.510 07:44:38 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:08:53.510 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:08:53.770 malloc_lvol_verify 00:08:53.770 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:08:54.030 13e7ec1d-0291-4538-b623-0c1b95db4c34 00:08:54.030 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:08:54.291 b794b4b4-9ecd-40f6-b782-9a64f67543ad 00:08:54.291 07:44:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:08:54.291 /dev/nbd0 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:08:54.291 mke2fs 1.46.5 (30-Dec-2021) 00:08:54.291 Discarding device blocks: 0/4096 done 00:08:54.291 Creating filesystem with 4096 1k blocks and 1024 inodes 00:08:54.291 00:08:54.291 Allocating group tables: 0/1 done 00:08:54.291 Writing inode tables: 0/1 done 00:08:54.291 Creating journal (1024 blocks): done 00:08:54.291 Writing superblocks and filesystem accounting information: 0/1 done 00:08:54.291 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:54.291 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.291 07:44:39 
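Once nbd_get_disks returns an empty list (the /dev/nbd count above is 0), the suite runs nbd_with_lvol_verify: it builds a small logical volume, exports it over NBD, and checks that mkfs.ext4 succeeds on it; the final nbd_stop_disk for /dev/nbd0 follows in the trace below. A condensed sketch of the traced RPC sequence (the rpc shorthand variable and the shortened paths are illustrative; the sizes match the trace, where the mkfs output reports 4096 1k blocks, i.e. a 4 MiB volume):

  rpc="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  $rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB malloc bdev, 512-byte blocks
  $rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore on top of the malloc bdev
  $rpc bdev_lvol_create lvol 4 -l lvs                    # 4 MiB lvol inside the "lvs" store
  $rpc nbd_start_disk lvs/lvol /dev/nbd0                 # expose the lvol as /dev/nbd0
  mkfs.ext4 /dev/nbd0                                    # the verify step: formatting must succeed
  mkfs_ret=$?
  $rpc nbd_stop_disk /dev/nbd0                           # tear down and wait for the device to go away
  waitfornbd_exit nbd0
  [ "$mkfs_ret" -eq 0 ]                                  # final status of the check (0 means mkfs succeeded)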
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1565078 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1565078 ']' 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1565078 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:54.551 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1565078 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1565078' 00:08:54.811 killing process with pid 1565078 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1565078 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1565078 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:08:54.811 00:08:54.811 real 0m18.954s 00:08:54.811 user 0m26.479s 00:08:54.811 sys 0m7.856s 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:54.811 07:44:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:54.811 ************************************ 00:08:54.811 END TEST bdev_nbd 00:08:54.811 ************************************ 00:08:55.070 07:44:39 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:55.070 07:44:39 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:08:55.070 07:44:39 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:08:55.070 07:44:39 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:08:55.070 07:44:39 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:08:55.070 07:44:39 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:55.070 07:44:39 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:55.070 07:44:39 blockdev_general -- common/autotest_common.sh@10 -- 
# set +x 00:08:55.070 ************************************ 00:08:55.070 START TEST bdev_fio 00:08:55.070 ************************************ 00:08:55.070 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:08:55.070 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:08:55.070 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:55.070 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:55.070 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:08:55.070 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # 
echo '[job_Malloc1p0]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # 
for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:55.071 07:44:39 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:55.071 ************************************ 00:08:55.071 START TEST bdev_fio_rw_verify 00:08:55.071 ************************************ 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:55.071 07:44:39 
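The fio run that follows drives a stock /usr/src/fio/fio binary against SPDK bdevs through the external spdk_bdev ioengine: the per-bdev [job_*] sections echoed above are appended to test/bdev/bdev.fio, and the plugin is injected with LD_PRELOAD. A condensed sketch of the invocation, with paths shortened; the [global] verify options are produced by fio_config_gen and are not visible in this trace, so the job-file excerpt in the comments covers only the part that is:

  # bdev.fio, as assembled by the trace above (global verify section omitted here)
  #   serialize_overlap=1
  #   [job_Malloc0]
  #   filename=Malloc0
  #   [job_Malloc1p0]
  #   filename=Malloc1p0
  #   ...one section per bdev, ending with [job_AIO0]
  LD_PRELOAD=./build/fio/spdk_bdev \
      /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
      test/bdev/bdev.fio --verify_state_save=0 \
      --spdk_json_conf=test/bdev/bdev.json --spdk_mem=0 --aux-path=../output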
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:55.071 07:44:39 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:55.648 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 
4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:55.648 fio-3.35 00:08:55.648 Starting 16 threads 00:09:07.908 00:09:07.908 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1569170: Mon Jul 15 07:44:50 2024 00:09:07.908 read: IOPS=106k, BW=414MiB/s (434MB/s)(4144MiB/10001msec) 00:09:07.908 slat (nsec): min=1797, max=1328.8k, avg=28640.00, stdev=18734.18 00:09:07.908 clat (usec): min=7, max=1638, avg=248.45, stdev=142.78 00:09:07.908 lat (usec): min=12, max=1665, avg=277.09, stdev=150.57 00:09:07.908 clat percentiles (usec): 00:09:07.908 | 50.000th=[ 237], 99.000th=[ 635], 99.900th=[ 824], 99.990th=[ 988], 00:09:07.908 | 99.999th=[ 1106] 00:09:07.908 write: IOPS=164k, BW=641MiB/s (672MB/s)(6333MiB/9882msec); 0 zone resets 00:09:07.908 slat (usec): min=4, max=433, avg=42.15, stdev=20.53 00:09:07.908 clat (usec): min=7, max=1433, avg=303.23, stdev=161.83 00:09:07.909 lat (usec): min=23, max=1552, avg=345.38, stdev=170.42 00:09:07.909 clat percentiles (usec): 00:09:07.909 | 50.000th=[ 285], 99.000th=[ 750], 99.900th=[ 947], 99.990th=[ 1106], 00:09:07.909 | 99.999th=[ 1237] 00:09:07.909 bw ( KiB/s): min=493712, max=754400, per=98.65%, avg=647410.58, stdev=5063.26, samples=304 00:09:07.909 iops : min=123428, max=188600, avg=161852.58, stdev=1265.82, samples=304 00:09:07.909 lat (usec) : 10=0.01%, 20=0.15%, 50=2.56%, 100=9.67%, 250=34.39% 00:09:07.909 lat (usec) : 500=43.73%, 750=8.82%, 1000=0.65% 00:09:07.909 lat (msec) : 2=0.03% 00:09:07.909 cpu : usr=99.38%, sys=0.27%, ctx=530, majf=0, minf=1139 00:09:07.909 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:07.909 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:07.909 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:07.909 issued rwts: total=1060829,1621338,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:07.909 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:07.909 00:09:07.909 Run status group 0 (all jobs): 00:09:07.909 READ: bw=414MiB/s (434MB/s), 414MiB/s-414MiB/s (434MB/s-434MB/s), io=4144MiB (4345MB), run=10001-10001msec 00:09:07.909 WRITE: bw=641MiB/s (672MB/s), 641MiB/s-641MiB/s (672MB/s-672MB/s), io=6333MiB (6641MB), run=9882-9882msec 00:09:07.909 00:09:07.909 real 0m11.301s 00:09:07.909 user 2m47.329s 00:09:07.909 sys 0m2.042s 00:09:07.909 07:44:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.909 07:44:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:07.909 ************************************ 
00:09:07.909 END TEST bdev_fio_rw_verify 00:09:07.909 ************************************ 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:07.909 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:07.910 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "351068c2-f1e6-428f-8de2-b4eb7bf1de6c"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "351068c2-f1e6-428f-8de2-b4eb7bf1de6c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "b5331515-b439-569f-a399-0991749fcd85"' ' ],' ' "product_name": "Split 
Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b5331515-b439-569f-a399-0991749fcd85",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "5e7a6265-14d8-5855-8ee9-35611f90f2f4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "5e7a6265-14d8-5855-8ee9-35611f90f2f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "5f196542-a252-5082-8eb3-2dce50cb2935"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5f196542-a252-5082-8eb3-2dce50cb2935",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "a3937c82-f699-550e-a283-e029799125a5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a3937c82-f699-550e-a283-e029799125a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "20bd3fc4-58a6-5ca6-9aa7-d4af40364bce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "20bd3fc4-58a6-5ca6-9aa7-d4af40364bce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "1832e6f7-1cf0-5de2-8ce0-9b93efbb4144"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1832e6f7-1cf0-5de2-8ce0-9b93efbb4144",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "19aefb44-3611-5292-977a-b48c85b3029a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "19aefb44-3611-5292-977a-b48c85b3029a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "e6e53a67-4be8-5574-bf92-44a1dcfba675"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e6e53a67-4be8-5574-bf92-44a1dcfba675",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "655fbb5b-1213-5a31-bbe6-f52e8d77903b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "655fbb5b-1213-5a31-bbe6-f52e8d77903b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b39264c5-79b2-5b75-bc55-5db42a07ced3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b39264c5-79b2-5b75-bc55-5db42a07ced3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f87f8afa-ebd0-5a3c-ac5a-bfc11141ee81"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f87f8afa-ebd0-5a3c-ac5a-bfc11141ee81",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "619d7da8-259a-435a-bdf8-6d8b9ce22576"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "619d7da8-259a-435a-bdf8-6d8b9ce22576",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "619d7da8-259a-435a-bdf8-6d8b9ce22576",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "52647842-8927-49ee-9770-d386e5104a01",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "963e37a6-61cf-4c09-98ca-4d856c71d152",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "313e6746-4168-4e2f-9133-74edaf3d95ad"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "313e6746-4168-4e2f-9133-74edaf3d95ad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "313e6746-4168-4e2f-9133-74edaf3d95ad",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "abb19b59-9928-4846-925e-9cf4c72a4c56",' ' "is_configured": true,' ' "data_offset": 0,' 
' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "564a8074-7dbd-40a2-9201-6fa42b23db14",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "cbacb353-2df9-4e2f-8139-da8f76ccbe78"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cbacb353-2df9-4e2f-8139-da8f76ccbe78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "cbacb353-2df9-4e2f-8139-da8f76ccbe78",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "965a07a3-0ab5-4d68-ad40-da71a09e4e20",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "79040df3-f4c2-4dd7-bb4d-8c95d25d28f6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "25eee8d3-6854-4ff6-b358-93dfdc5335be"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "25eee8d3-6854-4ff6-b358-93dfdc5335be",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:07.910 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:07.910 Malloc1p0 00:09:07.910 Malloc1p1 00:09:07.910 Malloc2p0 00:09:07.910 Malloc2p1 00:09:07.910 Malloc2p2 00:09:07.910 Malloc2p3 00:09:07.910 Malloc2p4 00:09:07.910 Malloc2p5 00:09:07.910 Malloc2p6 00:09:07.910 Malloc2p7 00:09:07.910 TestPT 00:09:07.910 raid0 00:09:07.910 concat0 ]] 00:09:07.910 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # 
jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "351068c2-f1e6-428f-8de2-b4eb7bf1de6c"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "351068c2-f1e6-428f-8de2-b4eb7bf1de6c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "b5331515-b439-569f-a399-0991749fcd85"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b5331515-b439-569f-a399-0991749fcd85",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "5e7a6265-14d8-5855-8ee9-35611f90f2f4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "5e7a6265-14d8-5855-8ee9-35611f90f2f4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "5f196542-a252-5082-8eb3-2dce50cb2935"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5f196542-a252-5082-8eb3-2dce50cb2935",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' 
"zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "a3937c82-f699-550e-a283-e029799125a5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a3937c82-f699-550e-a283-e029799125a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "20bd3fc4-58a6-5ca6-9aa7-d4af40364bce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "20bd3fc4-58a6-5ca6-9aa7-d4af40364bce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "1832e6f7-1cf0-5de2-8ce0-9b93efbb4144"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1832e6f7-1cf0-5de2-8ce0-9b93efbb4144",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": 
[' ' "19aefb44-3611-5292-977a-b48c85b3029a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "19aefb44-3611-5292-977a-b48c85b3029a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "e6e53a67-4be8-5574-bf92-44a1dcfba675"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e6e53a67-4be8-5574-bf92-44a1dcfba675",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "655fbb5b-1213-5a31-bbe6-f52e8d77903b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "655fbb5b-1213-5a31-bbe6-f52e8d77903b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b39264c5-79b2-5b75-bc55-5db42a07ced3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b39264c5-79b2-5b75-bc55-5db42a07ced3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f87f8afa-ebd0-5a3c-ac5a-bfc11141ee81"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f87f8afa-ebd0-5a3c-ac5a-bfc11141ee81",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "619d7da8-259a-435a-bdf8-6d8b9ce22576"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "619d7da8-259a-435a-bdf8-6d8b9ce22576",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "619d7da8-259a-435a-bdf8-6d8b9ce22576",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "52647842-8927-49ee-9770-d386e5104a01",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "963e37a6-61cf-4c09-98ca-4d856c71d152",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "313e6746-4168-4e2f-9133-74edaf3d95ad"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "313e6746-4168-4e2f-9133-74edaf3d95ad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 
0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "313e6746-4168-4e2f-9133-74edaf3d95ad",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "abb19b59-9928-4846-925e-9cf4c72a4c56",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "564a8074-7dbd-40a2-9201-6fa42b23db14",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "cbacb353-2df9-4e2f-8139-da8f76ccbe78"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "cbacb353-2df9-4e2f-8139-da8f76ccbe78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "cbacb353-2df9-4e2f-8139-da8f76ccbe78",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "965a07a3-0ab5-4d68-ad40-da71a09e4e20",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "79040df3-f4c2-4dd7-bb4d-8c95d25d28f6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "25eee8d3-6854-4ff6-b358-93dfdc5335be"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"25eee8d3-6854-4ff6-b358-93dfdc5335be",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 
00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.912 07:44:51 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:07.912 ************************************ 00:09:07.912 START TEST bdev_fio_trim 00:09:07.912 ************************************ 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:07.912 07:44:51 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:07.912 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.912 fio-3.35 00:09:07.912 Starting 14 threads 00:09:17.900 00:09:17.900 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1571196: Mon Jul 15 07:45:02 2024 00:09:17.900 write: IOPS=173k, BW=677MiB/s (710MB/s)(6775MiB/10001msec); 0 zone resets 00:09:17.900 slat (usec): min=2, max=860, avg=27.43, stdev=13.38 00:09:17.900 clat (usec): min=22, max=2403, avg=213.25, stdev=83.34 00:09:17.900 lat (usec): min=34, max=2418, avg=240.68, stdev=86.85 00:09:17.900 clat percentiles (usec): 00:09:17.900 | 50.000th=[ 204], 99.000th=[ 437], 99.900th=[ 506], 99.990th=[ 578], 00:09:17.900 | 99.999th=[ 783] 00:09:17.900 bw ( KiB/s): min=617448, max=840898, per=100.00%, avg=696346.63, stdev=4817.64, samples=266 00:09:17.900 iops : min=154362, max=210223, avg=174086.58, stdev=1204.39, samples=266 00:09:17.900 trim: IOPS=173k, BW=677MiB/s (710MB/s)(6775MiB/10001msec); 0 zone resets 00:09:17.900 slat (usec): min=3, max=396, avg=17.87, stdev= 8.35 00:09:17.900 clat (usec): min=3, max=2419, avg=231.42, stdev=90.86 00:09:17.900 lat (usec): min=11, max=2439, avg=249.29, stdev=93.91 00:09:17.900 clat percentiles (usec): 00:09:17.900 | 50.000th=[ 225], 99.000th=[ 465], 99.900th=[ 529], 99.990th=[ 611], 00:09:17.900 | 99.999th=[ 775] 00:09:17.900 bw ( KiB/s): min=617416, max=840962, per=100.00%, avg=696347.05, stdev=4817.78, samples=266 00:09:17.900 iops : min=154352, max=210239, avg=174086.58, stdev=1204.44, samples=266 
00:09:17.900 lat (usec) : 4=0.01%, 10=0.05%, 20=0.18%, 50=0.90%, 100=5.06% 00:09:17.900 lat (usec) : 250=59.46%, 500=34.15%, 750=0.21%, 1000=0.01% 00:09:17.900 lat (msec) : 2=0.01%, 4=0.01% 00:09:17.900 cpu : usr=99.69%, sys=0.00%, ctx=590, majf=0, minf=1066 00:09:17.900 IO depths : 1=12.3%, 2=24.6%, 4=50.1%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:17.900 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:17.900 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:17.900 issued rwts: total=0,1734488,1734492,0 short=0,0,0,0 dropped=0,0,0,0 00:09:17.900 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:17.900 00:09:17.900 Run status group 0 (all jobs): 00:09:17.900 WRITE: bw=677MiB/s (710MB/s), 677MiB/s-677MiB/s (710MB/s-710MB/s), io=6775MiB (7104MB), run=10001-10001msec 00:09:17.900 TRIM: bw=677MiB/s (710MB/s), 677MiB/s-677MiB/s (710MB/s-710MB/s), io=6775MiB (7104MB), run=10001-10001msec 00:09:17.900 00:09:17.900 real 0m11.277s 00:09:17.900 user 2m29.069s 00:09:17.900 sys 0m0.997s 00:09:17.900 07:45:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:17.900 07:45:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:17.900 ************************************ 00:09:17.900 END TEST bdev_fio_trim 00:09:17.900 ************************************ 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:17.900 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:17.900 00:09:17.900 real 0m22.935s 00:09:17.900 user 5m16.603s 00:09:17.900 sys 0m3.217s 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:17.900 07:45:02 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:17.900 ************************************ 00:09:17.900 END TEST bdev_fio 00:09:17.900 ************************************ 00:09:17.900 07:45:02 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:17.900 07:45:02 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:17.900 07:45:02 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:17.900 07:45:02 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:17.901 07:45:02 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.901 07:45:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:17.901 ************************************ 00:09:17.901 START TEST bdev_verify 00:09:17.901 ************************************ 00:09:17.901 07:45:02 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w 
verify -t 5 -C -m 0x3 '' 00:09:18.160 [2024-07-15 07:45:02.684188] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:09:18.160 [2024-07-15 07:45:02.684240] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573185 ] 00:09:18.160 [2024-07-15 07:45:02.773080] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:18.160 [2024-07-15 07:45:02.851661] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:18.160 [2024-07-15 07:45:02.851666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.421 [2024-07-15 07:45:02.996360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:18.421 [2024-07-15 07:45:02.996417] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:18.421 [2024-07-15 07:45:02.996426] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:18.421 [2024-07-15 07:45:03.004365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:18.421 [2024-07-15 07:45:03.004386] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:18.421 [2024-07-15 07:45:03.012375] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:18.421 [2024-07-15 07:45:03.012392] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:18.421 [2024-07-15 07:45:03.087458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:18.421 [2024-07-15 07:45:03.087519] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:18.421 [2024-07-15 07:45:03.087531] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b6990 00:09:18.421 [2024-07-15 07:45:03.087539] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:18.421 [2024-07-15 07:45:03.089079] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:18.421 [2024-07-15 07:45:03.089114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:18.680 Running I/O for 5 seconds... 
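The [job_*] and filename= lines echoed by bdev/blockdev.sh@356-358 earlier in this trace come from filtering the bdev list for devices that report unmap support; the resulting job file is then run through fio with the SPDK bdev plugin. A minimal sketch of that flow, using abbreviated relative paths and an assumed bdev.fio output name in place of the absolute workspace paths shown in the log:

    # Build one fio job section per unmap-capable bdev (sketch; "bdevs" is
    # assumed to hold the per-bdev JSON objects printed in the trace above).
    for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name'); do
            echo "[job_$b]"
            echo "filename=$b"    # the spdk_bdev ioengine resolves filenames as bdev names
    done >> bdev.fio

    # fio loads the SPDK plugin via LD_PRELOAD and takes the bdev layout
    # from the JSON configuration generated earlier in the run.
    LD_PRELOAD=./build/fio/spdk_bdev /usr/src/fio/fio \
            --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
            --spdk_json_conf=./test/bdev/bdev.json bdev.fio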
00:09:23.961 00:09:23.961 Latency(us) 00:09:23.961 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:23.961 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x1000 00:09:23.961 Malloc0 : 5.07 1362.65 5.32 0.00 0.00 93722.23 422.20 358129.03 00:09:23.961 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x1000 length 0x1000 00:09:23.961 Malloc0 : 5.12 1200.58 4.69 0.00 0.00 106377.20 504.12 380713.75 00:09:23.961 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x800 00:09:23.961 Malloc1p0 : 5.07 706.32 2.76 0.00 0.00 180181.25 2142.52 186323.89 00:09:23.961 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x800 length 0x800 00:09:23.961 Malloc1p0 : 5.12 625.05 2.44 0.00 0.00 203568.48 2558.42 199229.44 00:09:23.961 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x800 00:09:23.961 Malloc1p1 : 5.08 706.09 2.76 0.00 0.00 179716.86 2041.70 175838.13 00:09:23.961 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x800 length 0x800 00:09:23.961 Malloc1p1 : 5.12 624.80 2.44 0.00 0.00 203002.42 2445.00 197616.25 00:09:23.961 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x200 00:09:23.961 Malloc2p0 : 5.17 717.50 2.80 0.00 0.00 176427.28 1890.46 165352.37 00:09:23.961 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x200 length 0x200 00:09:23.961 Malloc2p0 : 5.12 624.55 2.44 0.00 0.00 202504.82 2356.78 196809.65 00:09:23.961 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x200 00:09:23.961 Malloc2p1 : 5.18 717.27 2.80 0.00 0.00 176104.59 1877.86 168578.76 00:09:23.961 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x200 length 0x200 00:09:23.961 Malloc2p1 : 5.13 624.30 2.44 0.00 0.00 202099.35 2356.78 196003.05 00:09:23.961 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x200 00:09:23.961 Malloc2p2 : 5.18 717.05 2.80 0.00 0.00 175785.07 1915.67 172611.74 00:09:23.961 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x200 length 0x200 00:09:23.961 Malloc2p2 : 5.13 624.05 2.44 0.00 0.00 201744.43 2369.38 197616.25 00:09:23.961 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x0 length 0x200 00:09:23.961 Malloc2p3 : 5.18 716.82 2.80 0.00 0.00 175479.47 1928.27 175031.53 00:09:23.961 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.961 Verification LBA range: start 0x200 length 0x200 00:09:23.962 Malloc2p3 : 5.21 638.60 2.49 0.00 0.00 196737.64 2508.01 196809.65 00:09:23.962 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x200 00:09:23.962 Malloc2p4 : 5.18 716.59 2.80 0.00 0.00 175176.19 2029.10 
175031.53 00:09:23.962 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x200 length 0x200 00:09:23.962 Malloc2p4 : 5.22 637.99 2.49 0.00 0.00 196382.77 2583.63 196003.05 00:09:23.962 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x200 00:09:23.962 Malloc2p5 : 5.18 716.36 2.80 0.00 0.00 174799.33 2142.52 173418.34 00:09:23.962 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x200 length 0x200 00:09:23.962 Malloc2p5 : 5.22 637.50 2.49 0.00 0.00 195936.85 2709.66 187937.08 00:09:23.962 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x200 00:09:23.962 Malloc2p6 : 5.18 716.13 2.80 0.00 0.00 174352.35 2243.35 170191.95 00:09:23.962 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x200 length 0x200 00:09:23.962 Malloc2p6 : 5.22 637.25 2.49 0.00 0.00 195339.82 2621.44 183097.50 00:09:23.962 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x200 00:09:23.962 Malloc2p7 : 5.19 715.90 2.80 0.00 0.00 173871.69 2180.33 167772.16 00:09:23.962 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x200 length 0x200 00:09:23.962 Malloc2p7 : 5.22 637.01 2.49 0.00 0.00 194892.55 2293.76 183097.50 00:09:23.962 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x1000 00:09:23.962 TestPT : 5.20 713.38 2.79 0.00 0.00 173953.66 9679.16 167772.16 00:09:23.962 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x1000 length 0x1000 00:09:23.962 TestPT : 5.20 615.44 2.40 0.00 0.00 201193.06 34280.37 183904.10 00:09:23.962 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x2000 00:09:23.962 raid0 : 5.19 715.41 2.79 0.00 0.00 173156.03 1903.06 164545.77 00:09:23.962 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x2000 length 0x2000 00:09:23.962 raid0 : 5.23 636.76 2.49 0.00 0.00 193999.45 2508.01 170998.55 00:09:23.962 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x2000 00:09:23.962 concat0 : 5.21 736.81 2.88 0.00 0.00 167815.37 2041.70 175031.53 00:09:23.962 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x2000 length 0x2000 00:09:23.962 concat0 : 5.23 636.52 2.49 0.00 0.00 193694.68 2394.58 177451.32 00:09:23.962 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 length 0x1000 00:09:23.962 raid1 : 5.22 736.18 2.88 0.00 0.00 167613.68 2432.39 183097.50 00:09:23.962 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x1000 length 0x1000 00:09:23.962 raid1 : 5.23 636.28 2.49 0.00 0.00 193352.23 3125.56 187130.49 00:09:23.962 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x0 
length 0x4e2 00:09:23.962 AIO0 : 5.22 735.75 2.87 0.00 0.00 167427.64 894.82 191163.47 00:09:23.962 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:23.962 Verification LBA range: start 0x4e2 length 0x4e2 00:09:23.962 AIO0 : 5.24 659.75 2.58 0.00 0.00 185975.47 1140.58 193583.26 00:09:23.962 =================================================================================================================== 00:09:23.962 Total : 22842.65 89.23 0.00 0.00 175511.27 422.20 380713.75 00:09:24.222 00:09:24.222 real 0m6.202s 00:09:24.222 user 0m11.672s 00:09:24.222 sys 0m0.286s 00:09:24.222 07:45:08 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:24.222 07:45:08 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:24.222 ************************************ 00:09:24.222 END TEST bdev_verify 00:09:24.222 ************************************ 00:09:24.222 07:45:08 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:24.222 07:45:08 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:24.222 07:45:08 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:24.222 07:45:08 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:24.222 07:45:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:24.222 ************************************ 00:09:24.222 START TEST bdev_verify_big_io 00:09:24.222 ************************************ 00:09:24.222 07:45:08 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:24.482 [2024-07-15 07:45:09.009093] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:09:24.482 [2024-07-15 07:45:09.009220] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1574129 ] 00:09:24.482 [2024-07-15 07:45:09.149836] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:24.482 [2024-07-15 07:45:09.212489] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:24.482 [2024-07-15 07:45:09.212494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.742 [2024-07-15 07:45:09.335540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:24.742 [2024-07-15 07:45:09.335581] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:24.742 [2024-07-15 07:45:09.335589] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:24.743 [2024-07-15 07:45:09.343549] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:24.743 [2024-07-15 07:45:09.343568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:24.743 [2024-07-15 07:45:09.351560] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:24.743 [2024-07-15 07:45:09.351577] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:24.743 [2024-07-15 07:45:09.412451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:24.743 [2024-07-15 07:45:09.412488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:24.743 [2024-07-15 07:45:09.412498] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bba990 00:09:24.743 [2024-07-15 07:45:09.412505] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:24.743 [2024-07-15 07:45:09.413772] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:24.743 [2024-07-15 07:45:09.413791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:25.003 [2024-07-15 07:45:09.591487] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.592919] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.594855] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.596159] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.597923] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.599177] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.600970] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.602785] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.604063] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.605892] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.606929] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.608348] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.609325] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.610755] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.611748] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.613199] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:09:25.003 [2024-07-15 07:45:09.633395] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:25.003 [2024-07-15 07:45:09.634984] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:25.003 Running I/O for 5 seconds... 00:09:33.132 00:09:33.132 Latency(us) 00:09:33.132 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:33.132 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x100 00:09:33.132 Malloc0 : 6.13 125.33 7.83 0.00 0.00 1001874.24 727.83 2581110.15 00:09:33.132 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x100 length 0x100 00:09:33.132 Malloc0 : 6.02 127.56 7.97 0.00 0.00 981113.83 901.12 2658543.46 00:09:33.132 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x80 00:09:33.132 Malloc1p0 : 6.48 75.35 4.71 0.00 0.00 1552835.94 1928.27 3084426.63 00:09:33.132 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x80 length 0x80 00:09:33.132 Malloc1p0 : 7.09 31.59 1.97 0.00 0.00 3600069.42 1468.26 5884931.15 00:09:33.132 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x80 00:09:33.132 Malloc1p1 : 6.86 32.66 2.04 0.00 0.00 3426347.14 1197.29 5704253.44 00:09:33.132 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x80 length 0x80 00:09:33.132 Malloc1p1 : 7.09 31.59 1.97 0.00 0.00 3466334.03 1499.77 5652631.24 00:09:33.132 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p0 : 6.36 20.13 1.26 0.00 0.00 1385924.49 545.08 2258471.38 00:09:33.132 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p0 : 6.48 22.22 1.39 0.00 0.00 1259404.66 614.40 2219754.73 00:09:33.132 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p1 : 6.36 20.13 1.26 0.00 0.00 1372411.41 526.18 2219754.73 00:09:33.132 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p1 : 6.48 22.21 1.39 0.00 0.00 1246311.72 567.14 2181038.08 00:09:33.132 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p2 : 6.36 20.12 1.26 0.00 0.00 1358570.57 475.77 2193943.63 00:09:33.132 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p2 : 6.49 22.20 1.39 0.00 0.00 1233526.80 576.59 2155226.98 00:09:33.132 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p3 : 6.36 20.12 1.26 0.00 0.00 1345139.56 491.52 2168132.53 00:09:33.132 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p3 : 6.49 22.20 1.39 0.00 0.00 1220871.54 573.44 2129415.88 00:09:33.132 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p4 : 6.36 20.12 1.26 0.00 0.00 1331706.65 478.92 2129415.88 00:09:33.132 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p4 : 6.49 22.19 1.39 0.00 0.00 1207972.07 579.74 2090699.22 00:09:33.132 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p5 : 6.48 22.23 1.39 0.00 0.00 1209281.74 478.92 2103604.78 00:09:33.132 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p5 : 6.49 22.19 1.39 0.00 0.00 1194972.56 573.44 2064888.12 00:09:33.132 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p6 : 6.48 22.23 1.39 0.00 0.00 1197025.12 485.22 2077793.67 00:09:33.132 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p6 : 6.49 22.19 1.39 0.00 0.00 1181807.93 570.29 2039077.02 00:09:33.132 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x20 00:09:33.132 Malloc2p7 : 6.48 22.22 1.39 0.00 0.00 1185284.69 491.52 2051982.57 00:09:33.132 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x20 length 0x20 00:09:33.132 Malloc2p7 : 6.49 22.18 1.39 0.00 0.00 1168704.43 573.44 2000360.37 00:09:33.132 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x100 00:09:33.132 TestPT : 7.01 32.25 2.02 0.00 0.00 3079890.42 123409.33 4000720.74 00:09:33.132 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x100 length 0x100 00:09:33.132 TestPT : 6.93 32.33 2.02 0.00 0.00 3042861.69 115343.36 3949098.54 00:09:33.132 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x200 00:09:33.132 raid0 : 7.07 36.22 2.26 0.00 0.00 2653343.94 1285.51 4852487.09 00:09:33.132 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x200 length 0x200 00:09:33.132 raid0 : 7.14 38.09 2.38 0.00 0.00 2558084.85 1550.18 4749242.68 00:09:33.132 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x200 00:09:33.132 concat0 : 7.01 43.51 2.72 0.00 0.00 2161707.66 1266.61 4645998.28 00:09:33.132 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x200 length 0x200 00:09:33.132 concat0 : 7.07 46.66 2.92 
0.00 0.00 2025066.46 1562.78 4542753.87 00:09:33.132 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x100 00:09:33.132 raid1 : 7.07 56.58 3.54 0.00 0.00 1643596.80 1625.80 4465320.57 00:09:33.132 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x100 length 0x100 00:09:33.132 raid1 : 7.10 50.74 3.17 0.00 0.00 1801022.35 2117.32 4336265.06 00:09:33.132 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x0 length 0x4e 00:09:33.132 AIO0 : 7.07 53.74 3.36 0.00 0.00 1024690.05 412.75 3303821.00 00:09:33.132 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:09:33.132 Verification LBA range: start 0x4e length 0x4e 00:09:33.132 AIO0 : 7.16 78.09 4.88 0.00 0.00 690259.40 708.92 3071521.08 00:09:33.132 =================================================================================================================== 00:09:33.132 Total : 1237.17 77.32 0.00 0.00 1649415.32 412.75 5884931.15 00:09:33.132 00:09:33.132 real 0m8.201s 00:09:33.132 user 0m15.517s 00:09:33.132 sys 0m0.321s 00:09:33.132 07:45:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:33.132 07:45:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:09:33.132 ************************************ 00:09:33.132 END TEST bdev_verify_big_io 00:09:33.132 ************************************ 00:09:33.132 07:45:17 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:33.132 07:45:17 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:33.132 07:45:17 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:33.132 07:45:17 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:33.132 07:45:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:33.132 ************************************ 00:09:33.132 START TEST bdev_write_zeroes 00:09:33.132 ************************************ 00:09:33.132 07:45:17 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:33.132 [2024-07-15 07:45:17.248153] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:09:33.132 [2024-07-15 07:45:17.248202] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576112 ] 00:09:33.132 [2024-07-15 07:45:17.333256] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.132 [2024-07-15 07:45:17.399797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.132 [2024-07-15 07:45:17.521595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:33.132 [2024-07-15 07:45:17.521636] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:33.132 [2024-07-15 07:45:17.521645] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:33.132 [2024-07-15 07:45:17.529601] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:33.132 [2024-07-15 07:45:17.529619] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:33.132 [2024-07-15 07:45:17.537615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:33.132 [2024-07-15 07:45:17.537630] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:33.132 [2024-07-15 07:45:17.598400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:33.132 [2024-07-15 07:45:17.598437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:33.133 [2024-07-15 07:45:17.598447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a10f0 00:09:33.133 [2024-07-15 07:45:17.598454] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:33.133 [2024-07-15 07:45:17.599597] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:33.133 [2024-07-15 07:45:17.599617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:33.133 Running I/O for 1 seconds... 
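The TestPT device exercised by the jobs above and below is a passthru vbdev stacked on Malloc3; in this run it comes from the bdev.json config handed to bdevperf, but the same device can be stood up by hand against a running SPDK app. A minimal sketch, assuming the stock rpc.py client and a base bdev named Malloc3 (paths as in this workspace, flags not copied from this run):

rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc_py bdev_passthru_create -b Malloc3 -p TestPT   # registers vbdev_passthru on Malloc3 -> "created pt_bdev for: TestPT"
$rpc_py bdev_get_bdevs -b TestPT                    # confirm the passthru bdev is claimed and visible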
00:09:34.514 00:09:34.514 Latency(us) 00:09:34.514 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:34.514 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc0 : 1.04 6027.77 23.55 0.00 0.00 21223.90 523.03 35691.91 00:09:34.514 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc1p0 : 1.04 6020.40 23.52 0.00 0.00 21217.33 753.03 34885.32 00:09:34.514 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc1p1 : 1.04 6013.07 23.49 0.00 0.00 21205.71 740.43 34280.37 00:09:34.514 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p0 : 1.04 6005.74 23.46 0.00 0.00 21191.77 743.58 33473.77 00:09:34.514 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p1 : 1.05 5998.46 23.43 0.00 0.00 21179.32 746.73 32868.82 00:09:34.514 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p2 : 1.05 5991.21 23.40 0.00 0.00 21170.19 743.58 32062.23 00:09:34.514 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p3 : 1.05 5983.91 23.37 0.00 0.00 21158.11 743.58 31255.63 00:09:34.514 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p4 : 1.05 5976.70 23.35 0.00 0.00 21143.99 765.64 30650.68 00:09:34.514 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p5 : 1.05 5969.51 23.32 0.00 0.00 21131.55 753.03 29844.09 00:09:34.514 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p6 : 1.05 5962.28 23.29 0.00 0.00 21117.73 762.49 29037.49 00:09:34.514 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 Malloc2p7 : 1.05 5955.10 23.26 0.00 0.00 21105.04 743.58 28432.54 00:09:34.514 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 TestPT : 1.05 5947.96 23.23 0.00 0.00 21094.75 787.69 27625.94 00:09:34.514 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 raid0 : 1.06 5939.75 23.20 0.00 0.00 21074.14 1449.35 26214.40 00:09:34.514 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 concat0 : 1.06 5931.66 23.17 0.00 0.00 21031.90 1436.75 24702.03 00:09:34.514 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 raid1 : 1.06 5921.58 23.13 0.00 0.00 20985.38 2230.74 22483.89 00:09:34.514 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:09:34.514 AIO0 : 1.06 5915.43 23.11 0.00 0.00 20921.99 784.54 21677.29 00:09:34.514 =================================================================================================================== 00:09:34.514 Total : 95560.52 373.28 0.00 0.00 21122.05 523.03 35691.91 00:09:34.514 00:09:34.514 real 0m1.936s 00:09:34.514 user 0m1.617s 00:09:34.514 sys 0m0.212s 00:09:34.514 07:45:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:34.514 07:45:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:09:34.514 ************************************ 00:09:34.514 END TEST bdev_write_zeroes 00:09:34.514 ************************************ 00:09:34.514 07:45:19 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:09:34.515 07:45:19 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:34.515 07:45:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:34.515 07:45:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:34.515 07:45:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:34.515 ************************************ 00:09:34.515 START TEST bdev_json_nonenclosed 00:09:34.515 ************************************ 00:09:34.515 07:45:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:34.515 [2024-07-15 07:45:19.262044] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:09:34.515 [2024-07-15 07:45:19.262086] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576443 ] 00:09:34.775 [2024-07-15 07:45:19.347430] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:34.775 [2024-07-15 07:45:19.409271] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:34.775 [2024-07-15 07:45:19.409320] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:09:34.775 [2024-07-15 07:45:19.409331] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:34.775 [2024-07-15 07:45:19.409337] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:34.775 00:09:34.775 real 0m0.256s 00:09:34.775 user 0m0.165s 00:09:34.775 sys 0m0.090s 00:09:34.775 07:45:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:09:34.775 07:45:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:34.775 07:45:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:09:34.775 ************************************ 00:09:34.775 END TEST bdev_json_nonenclosed 00:09:34.775 ************************************ 00:09:34.775 07:45:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:34.775 07:45:19 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:09:34.775 07:45:19 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:34.775 07:45:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:34.775 07:45:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:34.775 07:45:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:35.035 ************************************ 00:09:35.035 START TEST bdev_json_nonarray 00:09:35.035 ************************************ 00:09:35.035 07:45:19 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:09:35.035 [2024-07-15 07:45:19.609706] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:09:35.035 [2024-07-15 07:45:19.609770] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576465 ] 00:09:35.035 [2024-07-15 07:45:19.695348] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:35.035 [2024-07-15 07:45:19.760082] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:35.035 [2024-07-15 07:45:19.760140] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
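The two JSON-config negative tests (nonenclosed, nonarray) each feed bdevperf a deliberately malformed config and expect it to bail out (es=234). A minimal sketch of the shape a well-formed config has, assuming nothing about the exact nonenclosed.json / nonarray.json fixtures (they are not shown in this log):

cat <<'EOF' > /tmp/minimal_bdev.json
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create", "params": { "name": "Malloc0", "num_blocks": 262144, "block_size": 512 } }
      ]
    }
  ]
}
EOF
# json_config_prepare_ctx() rejects a file whose top level is not a single {...} object ("not enclosed in {}")
# and one whose "subsystems" key is not an array ("'subsystems' should be an array"), which is exactly what
# the two *ERROR* lines above report before spdk_app_stop() exits non-zero.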
00:09:35.035 [2024-07-15 07:45:19.760151] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:09:35.035 [2024-07-15 07:45:19.760158] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:35.296 00:09:35.296 real 0m0.270s 00:09:35.296 user 0m0.158s 00:09:35.296 sys 0m0.111s 00:09:35.296 07:45:19 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:09:35.296 07:45:19 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:35.296 07:45:19 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:09:35.296 ************************************ 00:09:35.296 END TEST bdev_json_nonarray 00:09:35.296 ************************************ 00:09:35.296 07:45:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:09:35.296 07:45:19 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:09:35.296 07:45:19 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:09:35.296 07:45:19 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:09:35.296 07:45:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:35.296 07:45:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:35.296 07:45:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:35.296 ************************************ 00:09:35.296 START TEST bdev_qos 00:09:35.296 ************************************ 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=1576493 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 1576493' 00:09:35.296 Process qos testing pid: 1576493 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 1576493 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 1576493 ']' 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:35.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:35.296 07:45:19 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:35.296 [2024-07-15 07:45:19.951014] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
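The QoS suite that starts here drives bdevperf in two steps, both of which appear verbatim in this log: launch it idle with -z, then kick the workload over RPC once the devices are configured. A minimal sketch of the same pattern run by hand (paths as in this workspace; the real script waits with waitforlisten rather than sleeping):

bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
$bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' &   # -z: start idle and wait for the perform_tests RPC
sleep 1                                                     # crude stand-in for waitforlisten on /var/tmp/spdk.sock
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests   # starts the 60 s randread run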
00:09:35.296 [2024-07-15 07:45:19.951060] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576493 ] 00:09:35.296 [2024-07-15 07:45:20.030462] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:35.556 [2024-07-15 07:45:20.133801] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:36.127 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:36.127 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:36.127 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:36.128 Malloc_0 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:36.128 [ 00:09:36.128 { 00:09:36.128 "name": "Malloc_0", 00:09:36.128 "aliases": [ 00:09:36.128 "8ab38a23-637d-48a3-8084-16f35c137f1f" 00:09:36.128 ], 00:09:36.128 "product_name": "Malloc disk", 00:09:36.128 "block_size": 512, 00:09:36.128 "num_blocks": 262144, 00:09:36.128 "uuid": "8ab38a23-637d-48a3-8084-16f35c137f1f", 00:09:36.128 "assigned_rate_limits": { 00:09:36.128 "rw_ios_per_sec": 0, 00:09:36.128 "rw_mbytes_per_sec": 0, 00:09:36.128 "r_mbytes_per_sec": 0, 00:09:36.128 "w_mbytes_per_sec": 0 00:09:36.128 }, 00:09:36.128 "claimed": false, 00:09:36.128 "zoned": false, 00:09:36.128 "supported_io_types": { 00:09:36.128 "read": true, 00:09:36.128 "write": true, 00:09:36.128 "unmap": true, 00:09:36.128 "flush": true, 00:09:36.128 "reset": true, 00:09:36.128 "nvme_admin": false, 00:09:36.128 "nvme_io": false, 00:09:36.128 "nvme_io_md": false, 00:09:36.128 "write_zeroes": true, 00:09:36.128 "zcopy": true, 00:09:36.128 "get_zone_info": false, 00:09:36.128 "zone_management": false, 00:09:36.128 "zone_append": false, 00:09:36.128 "compare": false, 
00:09:36.128 "compare_and_write": false, 00:09:36.128 "abort": true, 00:09:36.128 "seek_hole": false, 00:09:36.128 "seek_data": false, 00:09:36.128 "copy": true, 00:09:36.128 "nvme_iov_md": false 00:09:36.128 }, 00:09:36.128 "memory_domains": [ 00:09:36.128 { 00:09:36.128 "dma_device_id": "system", 00:09:36.128 "dma_device_type": 1 00:09:36.128 }, 00:09:36.128 { 00:09:36.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:36.128 "dma_device_type": 2 00:09:36.128 } 00:09:36.128 ], 00:09:36.128 "driver_specific": {} 00:09:36.128 } 00:09:36.128 ] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:36.128 Null_1 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:36.128 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:36.128 [ 00:09:36.128 { 00:09:36.128 "name": "Null_1", 00:09:36.128 "aliases": [ 00:09:36.128 "ed8011ea-048f-4e9e-80d1-f51321f8014c" 00:09:36.128 ], 00:09:36.128 "product_name": "Null disk", 00:09:36.128 "block_size": 512, 00:09:36.128 "num_blocks": 262144, 00:09:36.128 "uuid": "ed8011ea-048f-4e9e-80d1-f51321f8014c", 00:09:36.128 "assigned_rate_limits": { 00:09:36.128 "rw_ios_per_sec": 0, 00:09:36.128 "rw_mbytes_per_sec": 0, 00:09:36.128 "r_mbytes_per_sec": 0, 00:09:36.128 "w_mbytes_per_sec": 0 00:09:36.128 }, 00:09:36.128 "claimed": false, 00:09:36.389 "zoned": false, 00:09:36.389 "supported_io_types": { 00:09:36.389 "read": true, 00:09:36.389 "write": true, 00:09:36.389 "unmap": false, 00:09:36.389 "flush": false, 00:09:36.389 "reset": true, 00:09:36.389 "nvme_admin": false, 00:09:36.389 "nvme_io": false, 00:09:36.389 "nvme_io_md": false, 00:09:36.389 "write_zeroes": true, 00:09:36.389 "zcopy": false, 00:09:36.389 "get_zone_info": false, 00:09:36.389 "zone_management": false, 00:09:36.389 "zone_append": false, 00:09:36.389 
"compare": false, 00:09:36.389 "compare_and_write": false, 00:09:36.389 "abort": true, 00:09:36.389 "seek_hole": false, 00:09:36.389 "seek_data": false, 00:09:36.389 "copy": false, 00:09:36.389 "nvme_iov_md": false 00:09:36.389 }, 00:09:36.389 "driver_specific": {} 00:09:36.389 } 00:09:36.389 ] 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:36.389 07:45:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:36.389 Running I/O for 60 seconds... 
00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 58616.58 234466.30 0.00 0.00 235520.00 0.00 0.00 ' 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=58616.58 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 58616 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=58616 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=14000 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 14000 -gt 1000 ']' 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 14000 Malloc_0 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 14000 IOPS Malloc_0 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:41.712 07:45:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:41.712 ************************************ 00:09:41.712 START TEST bdev_qos_iops 00:09:41.712 ************************************ 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 14000 IOPS Malloc_0 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=14000 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:41.712 07:45:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:09:47.000 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14002.78 56011.11 0.00 0.00 57400.00 0.00 0.00 ' 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=14002.78 00:09:47.001 07:45:31 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14002 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14002 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=12600 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=15400 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14002 -lt 12600 ']' 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14002 -gt 15400 ']' 00:09:47.001 00:09:47.001 real 0m5.253s 00:09:47.001 user 0m0.108s 00:09:47.001 sys 0m0.043s 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:47.001 07:45:31 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:09:47.001 ************************************ 00:09:47.001 END TEST bdev_qos_iops 00:09:47.001 ************************************ 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:47.001 07:45:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20310.57 81242.26 0.00 0.00 82944.00 0.00 0.00 ' 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=82944.00 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 82944 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=82944 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.318 07:45:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:52.318 ************************************ 00:09:52.318 START TEST bdev_qos_bw 00:09:52.318 ************************************ 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:09:52.318 07:45:36 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2047.12 8188.47 0.00 0.00 8372.00 0.00 0.00 ' 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8372.00 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8372 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8372 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8372 -lt 7372 ']' 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8372 -gt 9011 ']' 00:09:57.604 00:09:57.604 real 0m5.272s 00:09:57.604 user 0m0.106s 00:09:57.604 sys 0m0.044s 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:57.604 ************************************ 00:09:57.604 END TEST bdev_qos_bw 00:09:57.604 ************************************ 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:57.604 07:45:42 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:57.604 ************************************ 00:09:57.604 START TEST bdev_qos_ro_bw 00:09:57.604 ************************************ 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:57.604 07:45:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.60 2046.40 0.00 0.00 2060.00 0.00 0.00 ' 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:02.898 00:10:02.898 real 0m5.174s 00:10:02.898 user 0m0.107s 00:10:02.898 sys 0m0.042s 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:02.898 07:45:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:02.898 ************************************ 00:10:02.898 END TEST bdev_qos_ro_bw 00:10:02.898 ************************************ 00:10:02.898 07:45:47 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:02.898 07:45:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:02.898 07:45:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.898 07:45:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:03.159 07:45:47 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:03.159 07:45:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:03.159 07:45:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:03.159 07:45:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:03.420 00:10:03.420 Latency(us) 00:10:03.420 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:03.420 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:03.420 Malloc_0 : 26.69 19353.97 75.60 0.00 0.00 13101.39 2104.71 503316.48 00:10:03.420 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:03.420 Null_1 : 26.84 19691.16 76.92 0.00 0.00 12968.06 894.82 154060.01 00:10:03.420 =================================================================================================================== 00:10:03.420 Total : 39045.13 152.52 0.00 0.00 13033.97 894.82 503316.48 00:10:03.420 0 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 1576493 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 1576493 ']' 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 1576493 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1576493 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1576493' 00:10:03.420 killing process with pid 1576493 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 1576493 00:10:03.420 Received shutdown signal, test time was about 26.909631 seconds 00:10:03.420 00:10:03.420 Latency(us) 00:10:03.420 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:03.420 
=================================================================================================================== 00:10:03.420 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:03.420 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 1576493 00:10:03.681 07:45:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:03.681 00:10:03.681 real 0m28.358s 00:10:03.681 user 0m29.193s 00:10:03.681 sys 0m0.751s 00:10:03.681 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:03.681 07:45:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:03.681 ************************************ 00:10:03.681 END TEST bdev_qos 00:10:03.681 ************************************ 00:10:03.681 07:45:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:03.681 07:45:48 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:03.681 07:45:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:03.681 07:45:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:03.681 07:45:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:03.681 ************************************ 00:10:03.681 START TEST bdev_qd_sampling 00:10:03.681 ************************************ 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=1581202 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 1581202' 00:10:03.681 Process bdev QD sampling period testing pid: 1581202 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 1581202 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 1581202 ']' 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:03.681 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:03.681 07:45:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:03.681 [2024-07-15 07:45:48.389172] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
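Queue-depth sampling in the suite that starts here is driven entirely over RPC; a minimal sketch of the same sequence the test performs below, assuming the stock rpc.py client (the rpc_cmd wrapper seen in the log) and the Malloc_QD bdev the test creates first:

rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc_py bdev_malloc_create -b Malloc_QD 128 512                 # 128 MiB malloc bdev, 512 B blocks
$rpc_py bdev_set_qd_sampling_period Malloc_QD 10                # enable sampling; echoed back as queue_depth_polling_period below
$rpc_py bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth_polling_period'   # expected to print 10
$rpc_py bdev_get_iostat -b Malloc_QD | jq -r '.bdevs[0].queue_depth'                  # outstanding I/O measured while the jobs run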
00:10:03.681 [2024-07-15 07:45:48.389227] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581202 ] 00:10:03.941 [2024-07-15 07:45:48.467631] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:03.941 [2024-07-15 07:45:48.565219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:03.941 [2024-07-15 07:45:48.565314] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.884 Malloc_QD 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:04.884 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:04.885 [ 00:10:04.885 { 00:10:04.885 "name": "Malloc_QD", 00:10:04.885 "aliases": [ 00:10:04.885 "703be69a-198c-4a9e-a37f-02ca2bf10030" 00:10:04.885 ], 00:10:04.885 "product_name": "Malloc disk", 00:10:04.885 "block_size": 512, 00:10:04.885 "num_blocks": 262144, 00:10:04.885 "uuid": "703be69a-198c-4a9e-a37f-02ca2bf10030", 00:10:04.885 "assigned_rate_limits": { 00:10:04.885 "rw_ios_per_sec": 0, 00:10:04.885 "rw_mbytes_per_sec": 0, 00:10:04.885 "r_mbytes_per_sec": 0, 00:10:04.885 "w_mbytes_per_sec": 0 00:10:04.885 }, 00:10:04.885 "claimed": false, 00:10:04.885 "zoned": false, 00:10:04.885 "supported_io_types": { 00:10:04.885 "read": true, 00:10:04.885 "write": true, 00:10:04.885 "unmap": true, 00:10:04.885 "flush": true, 00:10:04.885 "reset": true, 00:10:04.885 "nvme_admin": false, 00:10:04.885 
"nvme_io": false, 00:10:04.885 "nvme_io_md": false, 00:10:04.885 "write_zeroes": true, 00:10:04.885 "zcopy": true, 00:10:04.885 "get_zone_info": false, 00:10:04.885 "zone_management": false, 00:10:04.885 "zone_append": false, 00:10:04.885 "compare": false, 00:10:04.885 "compare_and_write": false, 00:10:04.885 "abort": true, 00:10:04.885 "seek_hole": false, 00:10:04.885 "seek_data": false, 00:10:04.885 "copy": true, 00:10:04.885 "nvme_iov_md": false 00:10:04.885 }, 00:10:04.885 "memory_domains": [ 00:10:04.885 { 00:10:04.885 "dma_device_id": "system", 00:10:04.885 "dma_device_type": 1 00:10:04.885 }, 00:10:04.885 { 00:10:04.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:04.885 "dma_device_type": 2 00:10:04.885 } 00:10:04.885 ], 00:10:04.885 "driver_specific": {} 00:10:04.885 } 00:10:04.885 ] 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:04.885 07:45:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:04.885 Running I/O for 5 seconds... 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:06.799 "tick_rate": 2600000000, 00:10:06.799 "ticks": 12538146612251567, 00:10:06.799 "bdevs": [ 00:10:06.799 { 00:10:06.799 "name": "Malloc_QD", 00:10:06.799 "bytes_read": 1104196096, 00:10:06.799 "num_read_ops": 269572, 00:10:06.799 "bytes_written": 0, 00:10:06.799 "num_write_ops": 0, 00:10:06.799 "bytes_unmapped": 0, 00:10:06.799 "num_unmap_ops": 0, 00:10:06.799 "bytes_copied": 0, 00:10:06.799 "num_copy_ops": 0, 00:10:06.799 "read_latency_ticks": 2564283990572, 00:10:06.799 "max_read_latency_ticks": 13235264, 00:10:06.799 "min_read_latency_ticks": 310108, 00:10:06.799 "write_latency_ticks": 0, 00:10:06.799 "max_write_latency_ticks": 0, 00:10:06.799 "min_write_latency_ticks": 0, 00:10:06.799 "unmap_latency_ticks": 0, 00:10:06.799 "max_unmap_latency_ticks": 0, 00:10:06.799 
"min_unmap_latency_ticks": 0, 00:10:06.799 "copy_latency_ticks": 0, 00:10:06.799 "max_copy_latency_ticks": 0, 00:10:06.799 "min_copy_latency_ticks": 0, 00:10:06.799 "io_error": {}, 00:10:06.799 "queue_depth_polling_period": 10, 00:10:06.799 "queue_depth": 512, 00:10:06.799 "io_time": 60, 00:10:06.799 "weighted_io_time": 30720 00:10:06.799 } 00:10:06.799 ] 00:10:06.799 }' 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:06.799 00:10:06.799 Latency(us) 00:10:06.799 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:06.799 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:06.799 Malloc_QD : 2.01 74447.84 290.81 0.00 0.00 3431.17 1127.98 3856.54 00:10:06.799 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:06.799 Malloc_QD : 2.01 65241.73 254.85 0.00 0.00 3915.07 1046.06 5091.64 00:10:06.799 =================================================================================================================== 00:10:06.799 Total : 139689.57 545.66 0.00 0.00 3657.22 1046.06 5091.64 00:10:06.799 0 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 1581202 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 1581202 ']' 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 1581202 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:06.799 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1581202 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1581202' 00:10:07.060 killing process with pid 1581202 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 1581202 00:10:07.060 Received shutdown signal, test time was about 2.088688 seconds 00:10:07.060 00:10:07.060 Latency(us) 00:10:07.060 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:07.060 =================================================================================================================== 00:10:07.060 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:07.060 07:45:51 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 1581202 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:07.060 00:10:07.060 real 0m3.380s 00:10:07.060 user 0m6.733s 00:10:07.060 sys 0m0.373s 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:07.060 07:45:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:07.060 ************************************ 00:10:07.060 END TEST bdev_qd_sampling 00:10:07.060 ************************************ 00:10:07.060 07:45:51 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:07.060 07:45:51 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:07.060 07:45:51 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:07.060 07:45:51 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:07.060 07:45:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.060 ************************************ 00:10:07.060 START TEST bdev_error 00:10:07.060 ************************************ 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=1581825 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 1581825' 00:10:07.060 Process error testing pid: 1581825 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 1581825 00:10:07.060 07:45:51 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1581825 ']' 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:07.060 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:07.060 07:45:51 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:07.320 [2024-07-15 07:45:51.841745] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:10:07.320 [2024-07-15 07:45:51.841803] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581825 ] 00:10:07.320 [2024-07-15 07:45:51.926053] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.320 [2024-07-15 07:45:52.026252] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:08.261 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.261 Dev_1 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.261 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.261 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 [ 00:10:08.262 { 00:10:08.262 "name": "Dev_1", 00:10:08.262 "aliases": [ 00:10:08.262 "12bc2a23-d2ba-4fe7-8d9e-b7dc6fb180ba" 00:10:08.262 ], 00:10:08.262 "product_name": "Malloc disk", 00:10:08.262 "block_size": 512, 00:10:08.262 "num_blocks": 262144, 00:10:08.262 "uuid": "12bc2a23-d2ba-4fe7-8d9e-b7dc6fb180ba", 00:10:08.262 "assigned_rate_limits": { 00:10:08.262 "rw_ios_per_sec": 0, 00:10:08.262 "rw_mbytes_per_sec": 0, 00:10:08.262 "r_mbytes_per_sec": 0, 00:10:08.262 "w_mbytes_per_sec": 0 00:10:08.262 }, 00:10:08.262 "claimed": false, 00:10:08.262 "zoned": false, 00:10:08.262 "supported_io_types": { 00:10:08.262 "read": true, 00:10:08.262 "write": true, 00:10:08.262 "unmap": true, 00:10:08.262 "flush": true, 00:10:08.262 "reset": true, 00:10:08.262 "nvme_admin": false, 00:10:08.262 "nvme_io": false, 00:10:08.262 "nvme_io_md": false, 00:10:08.262 "write_zeroes": true, 00:10:08.262 "zcopy": true, 00:10:08.262 "get_zone_info": false, 00:10:08.262 "zone_management": false, 00:10:08.262 "zone_append": false, 00:10:08.262 
"compare": false, 00:10:08.262 "compare_and_write": false, 00:10:08.262 "abort": true, 00:10:08.262 "seek_hole": false, 00:10:08.262 "seek_data": false, 00:10:08.262 "copy": true, 00:10:08.262 "nvme_iov_md": false 00:10:08.262 }, 00:10:08.262 "memory_domains": [ 00:10:08.262 { 00:10:08.262 "dma_device_id": "system", 00:10:08.262 "dma_device_type": 1 00:10:08.262 }, 00:10:08.262 { 00:10:08.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:08.262 "dma_device_type": 2 00:10:08.262 } 00:10:08.262 ], 00:10:08.262 "driver_specific": {} 00:10:08.262 } 00:10:08.262 ] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:08.262 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 true 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 Dev_2 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 [ 00:10:08.262 { 00:10:08.262 "name": "Dev_2", 00:10:08.262 "aliases": [ 00:10:08.262 "5b37d277-4ece-4047-ab62-d5f245d30201" 00:10:08.262 ], 00:10:08.262 "product_name": "Malloc disk", 00:10:08.262 "block_size": 512, 00:10:08.262 "num_blocks": 262144, 00:10:08.262 "uuid": "5b37d277-4ece-4047-ab62-d5f245d30201", 00:10:08.262 "assigned_rate_limits": { 00:10:08.262 "rw_ios_per_sec": 0, 00:10:08.262 "rw_mbytes_per_sec": 0, 00:10:08.262 "r_mbytes_per_sec": 0, 00:10:08.262 "w_mbytes_per_sec": 0 00:10:08.262 }, 00:10:08.262 "claimed": false, 
00:10:08.262 "zoned": false, 00:10:08.262 "supported_io_types": { 00:10:08.262 "read": true, 00:10:08.262 "write": true, 00:10:08.262 "unmap": true, 00:10:08.262 "flush": true, 00:10:08.262 "reset": true, 00:10:08.262 "nvme_admin": false, 00:10:08.262 "nvme_io": false, 00:10:08.262 "nvme_io_md": false, 00:10:08.262 "write_zeroes": true, 00:10:08.262 "zcopy": true, 00:10:08.262 "get_zone_info": false, 00:10:08.262 "zone_management": false, 00:10:08.262 "zone_append": false, 00:10:08.262 "compare": false, 00:10:08.262 "compare_and_write": false, 00:10:08.262 "abort": true, 00:10:08.262 "seek_hole": false, 00:10:08.262 "seek_data": false, 00:10:08.262 "copy": true, 00:10:08.262 "nvme_iov_md": false 00:10:08.262 }, 00:10:08.262 "memory_domains": [ 00:10:08.262 { 00:10:08.262 "dma_device_id": "system", 00:10:08.262 "dma_device_type": 1 00:10:08.262 }, 00:10:08.262 { 00:10:08.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:08.262 "dma_device_type": 2 00:10:08.262 } 00:10:08.262 ], 00:10:08.262 "driver_specific": {} 00:10:08.262 } 00:10:08.262 ] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:08.262 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 07:45:52 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:08.262 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:08.262 07:45:52 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:08.523 Running I/O for 5 seconds... 00:10:09.511 07:45:53 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 1581825 00:10:09.511 07:45:53 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 1581825' 00:10:09.511 Process is existed as continue on error is set. 
Pid: 1581825 00:10:09.511 07:45:53 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:09.511 07:45:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:09.511 07:45:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:09.511 07:45:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:09.511 07:45:53 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:09.511 07:45:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:09.511 07:45:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:09.511 07:45:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:09.511 07:45:53 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:09.511 Timeout while waiting for response: 00:10:09.511 00:10:09.511 00:10:13.711 00:10:13.711 Latency(us) 00:10:13.711 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:13.711 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:13.711 EE_Dev_1 : 0.77 35326.60 137.99 6.52 0.00 449.25 152.81 753.03 00:10:13.711 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:13.711 Dev_2 : 5.00 77934.47 304.43 0.00 0.00 201.67 79.95 18450.90 00:10:13.712 =================================================================================================================== 00:10:13.712 Total : 113261.08 442.43 6.52 0.00 217.76 79.95 18450.90 00:10:14.282 07:45:58 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 1581825 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 1581825 ']' 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 1581825 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1581825 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1581825' 00:10:14.282 killing process with pid 1581825 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 1581825 00:10:14.282 Received shutdown signal, test time was about 5.000000 seconds 00:10:14.282 00:10:14.282 Latency(us) 00:10:14.282 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:14.282 =================================================================================================================== 00:10:14.282 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:14.282 07:45:58 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 1581825 00:10:14.541 07:45:59 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=1583043 00:10:14.541 07:45:59 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 1583043' 00:10:14.541 Process error testing pid: 1583043 00:10:14.541 07:45:59 
blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:14.541 07:45:59 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 1583043 00:10:14.541 07:45:59 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1583043 ']' 00:10:14.541 07:45:59 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:14.541 07:45:59 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:14.541 07:45:59 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:14.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:14.541 07:45:59 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:14.541 07:45:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:14.541 [2024-07-15 07:45:59.221387] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:10:14.541 [2024-07-15 07:45:59.221454] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583043 ] 00:10:14.801 [2024-07-15 07:45:59.303823] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.801 [2024-07-15 07:45:59.403508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:15.373 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.373 Dev_1 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.373 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.373 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:15.634 07:46:00 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 [ 00:10:15.634 { 00:10:15.634 "name": "Dev_1", 00:10:15.634 "aliases": [ 00:10:15.634 "a0e0c286-dfba-44a3-8679-a74c5ee1e0b5" 00:10:15.634 ], 00:10:15.634 "product_name": "Malloc disk", 00:10:15.634 "block_size": 512, 00:10:15.634 "num_blocks": 262144, 00:10:15.634 "uuid": "a0e0c286-dfba-44a3-8679-a74c5ee1e0b5", 00:10:15.634 "assigned_rate_limits": { 00:10:15.634 "rw_ios_per_sec": 0, 00:10:15.634 "rw_mbytes_per_sec": 0, 00:10:15.634 "r_mbytes_per_sec": 0, 00:10:15.634 "w_mbytes_per_sec": 0 00:10:15.634 }, 00:10:15.634 "claimed": false, 00:10:15.634 "zoned": false, 00:10:15.634 "supported_io_types": { 00:10:15.634 "read": true, 00:10:15.634 "write": true, 00:10:15.634 "unmap": true, 00:10:15.634 "flush": true, 00:10:15.634 "reset": true, 00:10:15.634 "nvme_admin": false, 00:10:15.634 "nvme_io": false, 00:10:15.634 "nvme_io_md": false, 00:10:15.634 "write_zeroes": true, 00:10:15.634 "zcopy": true, 00:10:15.634 "get_zone_info": false, 00:10:15.634 "zone_management": false, 00:10:15.634 "zone_append": false, 00:10:15.634 "compare": false, 00:10:15.634 "compare_and_write": false, 00:10:15.634 "abort": true, 00:10:15.634 "seek_hole": false, 00:10:15.634 "seek_data": false, 00:10:15.634 "copy": true, 00:10:15.634 "nvme_iov_md": false 00:10:15.634 }, 00:10:15.634 "memory_domains": [ 00:10:15.634 { 00:10:15.634 "dma_device_id": "system", 00:10:15.634 "dma_device_type": 1 00:10:15.634 }, 00:10:15.634 { 00:10:15.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.634 "dma_device_type": 2 00:10:15.634 } 00:10:15.634 ], 00:10:15.634 "driver_specific": {} 00:10:15.634 } 00:10:15.634 ] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:15.634 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 true 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 Dev_2 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 [ 00:10:15.634 { 00:10:15.634 "name": "Dev_2", 00:10:15.634 "aliases": [ 00:10:15.634 "6681b56f-edbb-4c7e-b2a7-227ed2f0df74" 00:10:15.634 ], 00:10:15.634 "product_name": "Malloc disk", 00:10:15.634 "block_size": 512, 00:10:15.634 "num_blocks": 262144, 00:10:15.634 "uuid": "6681b56f-edbb-4c7e-b2a7-227ed2f0df74", 00:10:15.634 "assigned_rate_limits": { 00:10:15.634 "rw_ios_per_sec": 0, 00:10:15.634 "rw_mbytes_per_sec": 0, 00:10:15.634 "r_mbytes_per_sec": 0, 00:10:15.634 "w_mbytes_per_sec": 0 00:10:15.634 }, 00:10:15.634 "claimed": false, 00:10:15.634 "zoned": false, 00:10:15.634 "supported_io_types": { 00:10:15.634 "read": true, 00:10:15.634 "write": true, 00:10:15.634 "unmap": true, 00:10:15.634 "flush": true, 00:10:15.634 "reset": true, 00:10:15.634 "nvme_admin": false, 00:10:15.634 "nvme_io": false, 00:10:15.634 "nvme_io_md": false, 00:10:15.634 "write_zeroes": true, 00:10:15.634 "zcopy": true, 00:10:15.634 "get_zone_info": false, 00:10:15.634 "zone_management": false, 00:10:15.634 "zone_append": false, 00:10:15.634 "compare": false, 00:10:15.634 "compare_and_write": false, 00:10:15.634 "abort": true, 00:10:15.634 "seek_hole": false, 00:10:15.634 "seek_data": false, 00:10:15.634 "copy": true, 00:10:15.634 "nvme_iov_md": false 00:10:15.634 }, 00:10:15.634 "memory_domains": [ 00:10:15.634 { 00:10:15.634 "dma_device_id": "system", 00:10:15.634 "dma_device_type": 1 00:10:15.634 }, 00:10:15.634 { 00:10:15.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.634 "dma_device_type": 2 00:10:15.634 } 00:10:15.634 ], 00:10:15.634 "driver_specific": {} 00:10:15.634 } 00:10:15.634 ] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:15.634 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.634 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 1583043 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 1583043 00:10:15.634 07:46:00 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:15.634 07:46:00 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:15.634 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 1583043 00:10:15.634 Running I/O for 5 seconds... 00:10:15.634 task offset: 90816 on job bdev=EE_Dev_1 fails 00:10:15.634 00:10:15.634 Latency(us) 00:10:15.634 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:15.634 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:15.634 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:15.634 EE_Dev_1 : 0.00 27397.26 107.02 6226.65 0.00 394.21 163.05 708.92 00:10:15.634 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:15.634 Dev_2 : 0.00 17094.02 66.77 0.00 0.00 686.57 143.36 1272.91 00:10:15.634 =================================================================================================================== 00:10:15.634 Total : 44491.28 173.79 6226.65 0.00 552.78 143.36 1272.91 00:10:15.634 [2024-07-15 07:46:00.352480] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:15.634 request: 00:10:15.634 { 00:10:15.634 "method": "perform_tests", 00:10:15.634 "req_id": 1 00:10:15.634 } 00:10:15.634 Got JSON-RPC error response 00:10:15.634 response: 00:10:15.634 { 00:10:15.634 "code": -32603, 00:10:15.634 "message": "bdevperf failed with error Operation not permitted" 00:10:15.634 } 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:15.895 00:10:15.895 real 0m8.771s 00:10:15.895 user 0m9.271s 00:10:15.895 sys 0m0.750s 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:15.895 07:46:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:15.895 ************************************ 00:10:15.895 END TEST bdev_error 00:10:15.895 ************************************ 00:10:15.895 07:46:00 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:15.895 07:46:00 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:15.895 07:46:00 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:15.895 07:46:00 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:15.895 07:46:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:15.895 ************************************ 00:10:15.895 START TEST bdev_stat 00:10:15.895 ************************************ 00:10:15.895 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:15.895 07:46:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:15.895 07:46:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=1583375 00:10:15.896 
07:46:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 1583375' 00:10:15.896 Process Bdev IO statistics testing pid: 1583375 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 1583375 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 1583375 ']' 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:15.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:15.896 07:46:00 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:16.156 [2024-07-15 07:46:00.695456] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:10:16.156 [2024-07-15 07:46:00.695514] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583375 ] 00:10:16.156 [2024-07-15 07:46:00.787554] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:16.156 [2024-07-15 07:46:00.881784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:16.156 [2024-07-15 07:46:00.881834] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.098 Malloc_STAT 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:17.098 [ 00:10:17.098 { 00:10:17.098 "name": "Malloc_STAT", 00:10:17.098 "aliases": [ 00:10:17.098 "ad7cd53e-452e-46e6-b716-442d6a6045df" 00:10:17.098 ], 00:10:17.098 "product_name": "Malloc disk", 00:10:17.098 "block_size": 512, 00:10:17.098 "num_blocks": 262144, 00:10:17.098 "uuid": "ad7cd53e-452e-46e6-b716-442d6a6045df", 00:10:17.098 "assigned_rate_limits": { 00:10:17.098 "rw_ios_per_sec": 0, 00:10:17.098 "rw_mbytes_per_sec": 0, 00:10:17.098 "r_mbytes_per_sec": 0, 00:10:17.098 "w_mbytes_per_sec": 0 00:10:17.098 }, 00:10:17.098 "claimed": false, 00:10:17.098 "zoned": false, 00:10:17.098 "supported_io_types": { 00:10:17.098 "read": true, 00:10:17.098 "write": true, 00:10:17.098 "unmap": true, 00:10:17.098 "flush": true, 00:10:17.098 "reset": true, 00:10:17.098 "nvme_admin": false, 00:10:17.098 "nvme_io": false, 00:10:17.098 "nvme_io_md": false, 00:10:17.098 "write_zeroes": true, 00:10:17.098 "zcopy": true, 00:10:17.098 "get_zone_info": false, 00:10:17.098 "zone_management": false, 00:10:17.098 "zone_append": false, 00:10:17.098 "compare": false, 00:10:17.098 "compare_and_write": false, 00:10:17.098 "abort": true, 00:10:17.098 "seek_hole": false, 00:10:17.098 "seek_data": false, 00:10:17.098 "copy": true, 00:10:17.098 "nvme_iov_md": false 00:10:17.098 }, 00:10:17.098 "memory_domains": [ 00:10:17.098 { 00:10:17.098 "dma_device_id": "system", 00:10:17.098 "dma_device_type": 1 00:10:17.098 }, 00:10:17.098 { 00:10:17.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.098 "dma_device_type": 2 00:10:17.098 } 00:10:17.098 ], 00:10:17.098 "driver_specific": {} 00:10:17.098 } 00:10:17.098 ] 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:17.098 07:46:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:17.098 Running I/O for 10 seconds... 
00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:19.011 "tick_rate": 2600000000, 00:10:19.011 "ticks": 12538178429794065, 00:10:19.011 "bdevs": [ 00:10:19.011 { 00:10:19.011 "name": "Malloc_STAT", 00:10:19.011 "bytes_read": 1104196096, 00:10:19.011 "num_read_ops": 269572, 00:10:19.011 "bytes_written": 0, 00:10:19.011 "num_write_ops": 0, 00:10:19.011 "bytes_unmapped": 0, 00:10:19.011 "num_unmap_ops": 0, 00:10:19.011 "bytes_copied": 0, 00:10:19.011 "num_copy_ops": 0, 00:10:19.011 "read_latency_ticks": 2541503702014, 00:10:19.011 "max_read_latency_ticks": 12914458, 00:10:19.011 "min_read_latency_ticks": 276058, 00:10:19.011 "write_latency_ticks": 0, 00:10:19.011 "max_write_latency_ticks": 0, 00:10:19.011 "min_write_latency_ticks": 0, 00:10:19.011 "unmap_latency_ticks": 0, 00:10:19.011 "max_unmap_latency_ticks": 0, 00:10:19.011 "min_unmap_latency_ticks": 0, 00:10:19.011 "copy_latency_ticks": 0, 00:10:19.011 "max_copy_latency_ticks": 0, 00:10:19.011 "min_copy_latency_ticks": 0, 00:10:19.011 "io_error": {} 00:10:19.011 } 00:10:19.011 ] 00:10:19.011 }' 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=269572 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:19.011 "tick_rate": 2600000000, 00:10:19.011 "ticks": 12538178608877029, 00:10:19.011 "name": "Malloc_STAT", 00:10:19.011 "channels": [ 00:10:19.011 { 00:10:19.011 "thread_id": 2, 00:10:19.011 "bytes_read": 607125504, 00:10:19.011 "num_read_ops": 148224, 00:10:19.011 "bytes_written": 0, 00:10:19.011 "num_write_ops": 0, 00:10:19.011 "bytes_unmapped": 0, 00:10:19.011 "num_unmap_ops": 
0, 00:10:19.011 "bytes_copied": 0, 00:10:19.011 "num_copy_ops": 0, 00:10:19.011 "read_latency_ticks": 1316150747138, 00:10:19.011 "max_read_latency_ticks": 9514078, 00:10:19.011 "min_read_latency_ticks": 6867738, 00:10:19.011 "write_latency_ticks": 0, 00:10:19.011 "max_write_latency_ticks": 0, 00:10:19.011 "min_write_latency_ticks": 0, 00:10:19.011 "unmap_latency_ticks": 0, 00:10:19.011 "max_unmap_latency_ticks": 0, 00:10:19.011 "min_unmap_latency_ticks": 0, 00:10:19.011 "copy_latency_ticks": 0, 00:10:19.011 "max_copy_latency_ticks": 0, 00:10:19.011 "min_copy_latency_ticks": 0 00:10:19.011 }, 00:10:19.011 { 00:10:19.011 "thread_id": 3, 00:10:19.011 "bytes_read": 536870912, 00:10:19.011 "num_read_ops": 131072, 00:10:19.011 "bytes_written": 0, 00:10:19.011 "num_write_ops": 0, 00:10:19.011 "bytes_unmapped": 0, 00:10:19.011 "num_unmap_ops": 0, 00:10:19.011 "bytes_copied": 0, 00:10:19.011 "num_copy_ops": 0, 00:10:19.011 "read_latency_ticks": 1317576681742, 00:10:19.011 "max_read_latency_ticks": 12914458, 00:10:19.011 "min_read_latency_ticks": 7896804, 00:10:19.011 "write_latency_ticks": 0, 00:10:19.011 "max_write_latency_ticks": 0, 00:10:19.011 "min_write_latency_ticks": 0, 00:10:19.011 "unmap_latency_ticks": 0, 00:10:19.011 "max_unmap_latency_ticks": 0, 00:10:19.011 "min_unmap_latency_ticks": 0, 00:10:19.011 "copy_latency_ticks": 0, 00:10:19.011 "max_copy_latency_ticks": 0, 00:10:19.011 "min_copy_latency_ticks": 0 00:10:19.011 } 00:10:19.011 ] 00:10:19.011 }' 00:10:19.011 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=148224 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=148224 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=131072 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=279296 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:19.273 "tick_rate": 2600000000, 00:10:19.273 "ticks": 12538178912361767, 00:10:19.273 "bdevs": [ 00:10:19.273 { 00:10:19.273 "name": "Malloc_STAT", 00:10:19.273 "bytes_read": 1211150848, 00:10:19.273 "num_read_ops": 295684, 00:10:19.273 "bytes_written": 0, 00:10:19.273 "num_write_ops": 0, 00:10:19.273 "bytes_unmapped": 0, 00:10:19.273 "num_unmap_ops": 0, 00:10:19.273 "bytes_copied": 0, 00:10:19.273 "num_copy_ops": 0, 00:10:19.273 "read_latency_ticks": 2788574974418, 00:10:19.273 "max_read_latency_ticks": 12914458, 00:10:19.273 "min_read_latency_ticks": 276058, 00:10:19.273 "write_latency_ticks": 0, 00:10:19.273 "max_write_latency_ticks": 0, 00:10:19.273 "min_write_latency_ticks": 0, 00:10:19.273 "unmap_latency_ticks": 0, 00:10:19.273 "max_unmap_latency_ticks": 0, 00:10:19.273 "min_unmap_latency_ticks": 0, 00:10:19.273 "copy_latency_ticks": 0, 00:10:19.273 "max_copy_latency_ticks": 0, 00:10:19.273 
"min_copy_latency_ticks": 0, 00:10:19.273 "io_error": {} 00:10:19.273 } 00:10:19.273 ] 00:10:19.273 }' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=295684 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 279296 -lt 269572 ']' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 279296 -gt 295684 ']' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:19.273 00:10:19.273 Latency(us) 00:10:19.273 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:19.273 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:19.273 Malloc_STAT : 2.17 74808.53 292.22 0.00 0.00 3415.06 1020.85 3680.10 00:10:19.273 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:19.273 Malloc_STAT : 2.17 66109.91 258.24 0.00 0.00 3863.95 1027.15 4990.82 00:10:19.273 =================================================================================================================== 00:10:19.273 Total : 140918.44 550.46 0.00 0.00 3625.77 1020.85 4990.82 00:10:19.273 0 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 1583375 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 1583375 ']' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 1583375 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:19.273 07:46:03 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1583375 00:10:19.273 07:46:04 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:19.273 07:46:04 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:19.273 07:46:04 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1583375' 00:10:19.273 killing process with pid 1583375 00:10:19.273 07:46:04 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 1583375 00:10:19.273 Received shutdown signal, test time was about 2.250044 seconds 00:10:19.273 00:10:19.273 Latency(us) 00:10:19.273 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:19.273 =================================================================================================================== 00:10:19.273 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:19.273 07:46:04 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 1583375 00:10:19.534 07:46:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:19.534 00:10:19.534 real 0m3.518s 00:10:19.534 user 0m7.112s 00:10:19.534 sys 0m0.399s 00:10:19.534 07:46:04 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:19.534 07:46:04 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:19.534 ************************************ 00:10:19.534 END TEST bdev_stat 00:10:19.534 ************************************ 00:10:19.534 07:46:04 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:19.534 07:46:04 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:19.534 00:10:19.534 real 1m47.943s 00:10:19.534 user 7m11.161s 00:10:19.534 sys 0m16.175s 00:10:19.534 07:46:04 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:19.534 07:46:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:19.534 ************************************ 00:10:19.534 END TEST blockdev_general 00:10:19.534 ************************************ 00:10:19.534 07:46:04 -- common/autotest_common.sh@1142 -- # return 0 00:10:19.534 07:46:04 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:19.534 07:46:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:19.534 07:46:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:19.534 07:46:04 -- common/autotest_common.sh@10 -- # set +x 00:10:19.534 ************************************ 00:10:19.534 START TEST bdev_raid 00:10:19.534 ************************************ 00:10:19.534 07:46:04 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:19.794 * Looking for test storage... 
00:10:19.794 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:19.794 07:46:04 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:19.794 07:46:04 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:19.794 07:46:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:19.794 07:46:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:19.794 07:46:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:19.794 ************************************ 00:10:19.794 START TEST raid_function_test_raid0 00:10:19.794 ************************************ 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1584068 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1584068' 00:10:19.794 Process raid pid: 1584068 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1584068 /var/tmp/spdk-raid.sock 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 1584068 ']' 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:19.794 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:19.794 07:46:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:19.794 [2024-07-15 07:46:04.504029] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:10:19.794 [2024-07-15 07:46:04.504089] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:20.055 [2024-07-15 07:46:04.596838] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:20.055 [2024-07-15 07:46:04.690746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:20.055 [2024-07-15 07:46:04.749794] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:20.055 [2024-07-15 07:46:04.749823] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:20.625 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:20.885 [2024-07-15 07:46:05.590422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:20.885 [2024-07-15 07:46:05.591558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:20.885 [2024-07-15 07:46:05.591616] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c2820 00:10:20.885 [2024-07-15 07:46:05.591622] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:20.885 [2024-07-15 07:46:05.591864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c6170 00:10:20.885 [2024-07-15 07:46:05.591970] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c2820 00:10:20.885 [2024-07-15 07:46:05.591976] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x25c2820 00:10:20.885 [2024-07-15 07:46:05.592062] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.885 Base_1 00:10:20.885 Base_2 00:10:20.885 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:20.885 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:20.885 07:46:05 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:21.145 07:46:05 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:21.405 [2024-07-15 07:46:06.019525] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c5f80 00:10:21.405 /dev/nbd0 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:21.405 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.406 1+0 records in 00:10:21.406 1+0 records out 00:10:21.406 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293373 s, 14.0 MB/s 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.406 07:46:06 
bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.406 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:21.666 { 00:10:21.666 "nbd_device": "/dev/nbd0", 00:10:21.666 "bdev_name": "raid" 00:10:21.666 } 00:10:21.666 ]' 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:21.666 { 00:10:21.666 "nbd_device": "/dev/nbd0", 00:10:21.666 "bdev_name": "raid" 00:10:21.666 } 00:10:21.666 ]' 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:21.666 07:46:06 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:21.666 4096+0 records in 00:10:21.666 4096+0 records out 00:10:21.666 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0282274 s, 74.3 MB/s 00:10:21.666 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:21.926 4096+0 records in 00:10:21.926 4096+0 records out 00:10:21.926 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.224956 s, 9.3 MB/s 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:21.926 128+0 records in 00:10:21.926 128+0 records out 00:10:21.926 65536 bytes (66 kB, 64 KiB) copied, 0.000372695 s, 176 MB/s 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:21.926 2035+0 records in 00:10:21.926 2035+0 records out 00:10:21.926 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00479056 s, 217 MB/s 00:10:21.926 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:22.188 07:46:06 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:22.188 456+0 records in 00:10:22.188 456+0 records out 00:10:22.188 233472 bytes (233 kB, 228 KiB) copied, 0.00113647 s, 205 MB/s 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:22.188 [2024-07-15 07:46:06.928910] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:22.188 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:22.449 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:22.449 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:22.449 07:46:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:22.449 07:46:07 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1584068 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 1584068 ']' 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 1584068 00:10:22.449 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1584068 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1584068' 00:10:22.709 killing process with pid 1584068 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 1584068 00:10:22.709 [2024-07-15 07:46:07.251943] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 1584068 00:10:22.709 [2024-07-15 07:46:07.251988] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:22.709 [2024-07-15 07:46:07.252017] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:22.709 [2024-07-15 07:46:07.252024] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c2820 name raid, state offline 00:10:22.709 [2024-07-15 07:46:07.260911] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:22.709 00:10:22.709 real 0m2.928s 00:10:22.709 user 0m4.060s 00:10:22.709 sys 0m0.875s 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:22.709 07:46:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:22.709 ************************************ 00:10:22.709 END TEST raid_function_test_raid0 00:10:22.709 
************************************ 00:10:22.709 07:46:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:22.709 07:46:07 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:22.709 07:46:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:22.709 07:46:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:22.709 07:46:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:22.709 ************************************ 00:10:22.709 START TEST raid_function_test_concat 00:10:22.710 ************************************ 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1584664 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1584664' 00:10:22.710 Process raid pid: 1584664 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1584664 /var/tmp/spdk-raid.sock 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 1584664 ']' 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:22.710 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:22.710 07:46:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:22.969 [2024-07-15 07:46:07.511281] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:10:22.969 [2024-07-15 07:46:07.511337] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:22.969 [2024-07-15 07:46:07.601793] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:22.969 [2024-07-15 07:46:07.667988] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.969 [2024-07-15 07:46:07.711334] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:22.969 [2024-07-15 07:46:07.711356] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:23.908 [2024-07-15 07:46:08.531319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:23.908 [2024-07-15 07:46:08.532298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:23.908 [2024-07-15 07:46:08.532339] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x107d820 00:10:23.908 [2024-07-15 07:46:08.532348] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:23.908 [2024-07-15 07:46:08.532524] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10811b0 00:10:23.908 [2024-07-15 07:46:08.532611] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x107d820 00:10:23.908 [2024-07-15 07:46:08.532617] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x107d820 00:10:23.908 [2024-07-15 07:46:08.532691] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:23.908 Base_1 00:10:23.908 Base_2 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:23.908 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:24.168 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:24.169 07:46:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:24.738 [2024-07-15 07:46:09.269198] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1080fc0 00:10:24.738 /dev/nbd0 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.738 1+0 records in 00:10:24.738 1+0 records out 00:10:24.738 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269312 s, 15.2 MB/s 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:24.738 
07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.738 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:24.998 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:24.998 { 00:10:24.998 "nbd_device": "/dev/nbd0", 00:10:24.998 "bdev_name": "raid" 00:10:24.998 } 00:10:24.998 ]' 00:10:24.998 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:24.998 { 00:10:24.998 "nbd_device": "/dev/nbd0", 00:10:24.998 "bdev_name": "raid" 00:10:24.998 } 00:10:24.999 ]' 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:24.999 4096+0 records in 00:10:24.999 4096+0 records out 00:10:24.999 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0265829 s, 78.9 MB/s 00:10:24.999 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:25.258 4096+0 records in 00:10:25.258 4096+0 records out 00:10:25.258 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.167754 s, 12.5 MB/s 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:25.258 128+0 records in 00:10:25.258 128+0 records out 00:10:25.258 65536 bytes (66 kB, 64 KiB) copied, 0.000362705 s, 181 MB/s 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:25.258 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:25.258 2035+0 records in 00:10:25.258 2035+0 records out 00:10:25.259 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00473062 s, 220 MB/s 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:25.259 456+0 records in 00:10:25.259 456+0 
records out 00:10:25.259 233472 bytes (233 kB, 228 KiB) copied, 0.00113924 s, 205 MB/s 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:25.259 07:46:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:25.519 [2024-07-15 07:46:10.090175] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:25.519 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # echo '' 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:25.778 07:46:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1584664 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 1584664 ']' 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 1584664 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1584664 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1584664' 00:10:25.779 killing process with pid 1584664 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 1584664 00:10:25.779 [2024-07-15 07:46:10.401185] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:25.779 [2024-07-15 07:46:10.401230] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:25.779 [2024-07-15 07:46:10.401257] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:25.779 [2024-07-15 07:46:10.401263] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x107d820 name raid, state offline 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 1584664 00:10:25.779 [2024-07-15 07:46:10.410384] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:25.779 00:10:25.779 real 0m3.079s 00:10:25.779 user 0m4.460s 00:10:25.779 sys 0m0.816s 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:25.779 07:46:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:25.779 ************************************ 00:10:25.779 END TEST raid_function_test_concat 00:10:25.779 ************************************ 00:10:26.039 07:46:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:26.039 07:46:10 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:26.039 07:46:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:26.039 07:46:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:26.039 07:46:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
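Both function tests above finish with raid_unmap_data_verify: a 2 MiB random image is written to /dev/nbd0 with O_DIRECT and byte-compared against the source file, then three (offset, length) regions are zeroed in the source and blkdiscard-ed on the device; after a flush the two images must still match, i.e. the test expects the discarded ranges to read back as zeroes. The sketch below reproduces that loop under a few assumptions: blkdiscard is available, the 512-byte logical block size detected via lsblk in the trace, and a simplified block-size extraction compared to the script's cut-based one.

```bash
# Sketch of the raid_unmap_data_verify loop traced above: write a random image,
# then check that discarded regions on /dev/nbd0 read back as zeroes.
set -e
nbd=/dev/nbd0
ref=/raidtest/raidrandtest   # scratch file path used in this workspace
# Simplified block-size detection (512 in this trace).
blksize=$(lsblk -o LOG-SEC "$nbd" | grep -v LOG-SEC | head -1 | tr -d ' ')
rw_blk_num=4096              # 4096 blocks x 512 bytes = 2 MiB test image
rw_len=$((blksize * rw_blk_num))

unmap_blk_offs=(0 1028 321)
unmap_blk_nums=(128 2035 456)

dd if=/dev/urandom of="$ref" bs="$blksize" count="$rw_blk_num"
dd if="$ref" of="$nbd" bs="$blksize" count="$rw_blk_num" oflag=direct
blockdev --flushbufs "$nbd"
cmp -b -n "$rw_len" "$ref" "$nbd"

for ((i = 0; i < ${#unmap_blk_offs[@]}; i++)); do
	unmap_off=$((blksize * unmap_blk_offs[i]))
	unmap_len=$((blksize * unmap_blk_nums[i]))

	# Zero the region in the reference image and discard it on the device...
	dd if=/dev/zero of="$ref" bs="$blksize" seek="${unmap_blk_offs[i]}" \
		count="${unmap_blk_nums[i]}" conv=notrunc
	blkdiscard -o "$unmap_off" -l "$unmap_len" "$nbd"
	blockdev --flushbufs "$nbd"

	# ...then the two images must still match byte for byte.
	cmp -b -n "$rw_len" "$ref" "$nbd"
done
```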
00:10:26.039 ************************************ 00:10:26.039 START TEST raid0_resize_test 00:10:26.039 ************************************ 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=1585170 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 1585170' 00:10:26.039 Process raid pid: 1585170 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 1585170 /var/tmp/spdk-raid.sock 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 1585170 ']' 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:26.039 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:26.039 07:46:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.039 [2024-07-15 07:46:10.654203] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:10:26.039 [2024-07-15 07:46:10.654259] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:26.039 [2024-07-15 07:46:10.745584] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:26.299 [2024-07-15 07:46:10.821894] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:26.299 [2024-07-15 07:46:10.865887] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.299 [2024-07-15 07:46:10.865909] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:26.871 07:46:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:26.871 07:46:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:26.871 07:46:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:27.131 Base_1 00:10:27.131 07:46:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:27.131 Base_2 00:10:27.131 07:46:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:27.392 [2024-07-15 07:46:12.038728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:27.392 [2024-07-15 07:46:12.039803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:27.392 [2024-07-15 07:46:12.039835] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ae39d0 00:10:27.392 [2024-07-15 07:46:12.039840] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:27.392 [2024-07-15 07:46:12.039979] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae3cb0 00:10:27.392 [2024-07-15 07:46:12.040044] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ae39d0 00:10:27.392 [2024-07-15 07:46:12.040049] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1ae39d0 00:10:27.392 [2024-07-15 07:46:12.040116] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:27.392 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:27.654 [2024-07-15 07:46:12.223187] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:27.654 [2024-07-15 07:46:12.223203] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:27.654 true 00:10:27.654 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:27.654 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:27.914 [2024-07-15 07:46:12.415796] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:27.914 07:46:12 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:27.914 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:27.914 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:27.914 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:27.914 [2024-07-15 07:46:12.576051] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:27.914 [2024-07-15 07:46:12.576062] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:27.914 [2024-07-15 07:46:12.576076] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:27.914 true 00:10:27.914 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:27.914 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:28.174 [2024-07-15 07:46:12.772680] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:28.174 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 1585170 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 1585170 ']' 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 1585170 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1585170 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1585170' 00:10:28.175 killing process with pid 1585170 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 1585170 00:10:28.175 [2024-07-15 07:46:12.840530] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:28.175 [2024-07-15 07:46:12.840569] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:28.175 [2024-07-15 07:46:12.840600] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:28.175 [2024-07-15 07:46:12.840605] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ae39d0 name Raid, state offline 00:10:28.175 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 1585170 00:10:28.175 [2024-07-15 07:46:12.841558] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:28.507 07:46:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
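raid0_resize_test, traced above, builds a raid0 "Raid" bdev from two 32 MiB null bdevs with a 64 KiB strip, then grows each base to 64 MiB with bdev_null_resize and re-reads the array size via bdev_get_bdevs: after the first resize the array still reports 131072 blocks (64 MiB), and only once both bases have been resized does it jump to 262144 blocks (128 MiB). The following is a condensed sketch of that size check against a bdev_svc listening on the raid socket, not the test script itself.

```bash
# Condensed sketch of the raid0 resize check traced above.
set -e
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
blksize=512

$rpc_py bdev_null_create Base_1 32 "$blksize"
$rpc_py bdev_null_create Base_2 32 "$blksize"
$rpc_py bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid

raid_size_mb() {
	local blkcnt
	blkcnt=$($rpc_py bdev_get_bdevs -b Raid | jq '.[].num_blocks')
	echo $((blkcnt * blksize / 1048576))
}

raid_size_mb                     # 64: two 32 MiB bases

$rpc_py bdev_null_resize Base_1 64
raid_size_mb                     # still 64 in the trace: only one base has grown

$rpc_py bdev_null_resize Base_2 64
raid_size_mb                     # 128 once both bases are 64 MiB
```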
00:10:28.507 00:10:28.507 real 0m2.356s 00:10:28.507 user 0m3.667s 00:10:28.507 sys 0m0.423s 00:10:28.507 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:28.507 07:46:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.507 ************************************ 00:10:28.507 END TEST raid0_resize_test 00:10:28.507 ************************************ 00:10:28.507 07:46:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:28.507 07:46:12 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:28.507 07:46:12 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:28.507 07:46:12 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:28.507 07:46:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:28.507 07:46:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:28.507 07:46:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:28.507 ************************************ 00:10:28.507 START TEST raid_state_function_test 00:10:28.507 ************************************ 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:28.507 07:46:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1585710 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1585710' 00:10:28.507 Process raid pid: 1585710 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1585710 /var/tmp/spdk-raid.sock 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1585710 ']' 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:28.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:28.507 07:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.507 [2024-07-15 07:46:13.091407] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
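The raid_state_function_test that follows creates an "Existed_Raid" raid0 bdev whose base bdevs (BaseBdev1, BaseBdev2) do not exist yet, so the array has to sit in the "configuring" state with zero bases discovered; it then adds a 32 MiB malloc BaseBdev1, deletes and recreates the array, and re-checks the JSON returned by bdev_raid_get_bdevs all (dumped below) after each step. The sketch that follows is a reduced version of the kind of assertion the verify_raid_bdev_state helper performs; the field names are taken from the JSON in this trace, and the real helper checks more fields than shown here.

```bash
# Reduced sketch of the state assertion used repeatedly in the trace below:
# pull the raid bdev's JSON and check a few of its fields.
rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

verify_raid_bdev_state() {
	local name=$1 expected_state=$2 raid_level=$3 strip_size=$4 num_operational=$5
	local info
	info=$($rpc_py bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")

	[ "$(jq -r '.state' <<< "$info")" = "$expected_state" ] || return 1
	[ "$(jq -r '.raid_level' <<< "$info")" = "$raid_level" ] || return 1
	[ "$(jq -r '.strip_size_kb' <<< "$info")" -eq "$strip_size" ] || return 1
	[ "$(jq -r '.num_base_bdevs_operational' <<< "$info")" -eq "$num_operational" ] || return 1
}

# Right after bdev_raid_create with missing base bdevs, as in the trace:
verify_raid_bdev_state Existed_Raid configuring raid0 64 2
```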
00:10:28.507 [2024-07-15 07:46:13.091455] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:28.507 [2024-07-15 07:46:13.178243] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:28.792 [2024-07-15 07:46:13.241962] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:28.792 [2024-07-15 07:46:13.282649] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:28.792 [2024-07-15 07:46:13.282672] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.731 07:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:29.731 07:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:29.731 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.301 [2024-07-15 07:46:14.783556] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.301 [2024-07-15 07:46:14.783585] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.301 [2024-07-15 07:46:14.783591] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.301 [2024-07-15 07:46:14.783596] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.301 07:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.301 07:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:30.301 "name": "Existed_Raid", 00:10:30.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.301 "strip_size_kb": 64, 00:10:30.301 "state": "configuring", 00:10:30.301 "raid_level": "raid0", 00:10:30.301 "superblock": false, 00:10:30.301 
"num_base_bdevs": 2, 00:10:30.301 "num_base_bdevs_discovered": 0, 00:10:30.301 "num_base_bdevs_operational": 2, 00:10:30.301 "base_bdevs_list": [ 00:10:30.301 { 00:10:30.301 "name": "BaseBdev1", 00:10:30.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.301 "is_configured": false, 00:10:30.301 "data_offset": 0, 00:10:30.301 "data_size": 0 00:10:30.301 }, 00:10:30.301 { 00:10:30.301 "name": "BaseBdev2", 00:10:30.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.301 "is_configured": false, 00:10:30.301 "data_offset": 0, 00:10:30.301 "data_size": 0 00:10:30.301 } 00:10:30.301 ] 00:10:30.301 }' 00:10:30.301 07:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:30.301 07:46:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:30.871 07:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:31.132 [2024-07-15 07:46:15.693754] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:31.132 [2024-07-15 07:46:15.693772] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10076b0 name Existed_Raid, state configuring 00:10:31.132 07:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:31.132 [2024-07-15 07:46:15.878235] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:31.132 [2024-07-15 07:46:15.878251] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:31.132 [2024-07-15 07:46:15.878256] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:31.132 [2024-07-15 07:46:15.878262] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:31.393 07:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:31.393 [2024-07-15 07:46:16.077138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:31.393 BaseBdev1 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:31.393 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:31.654 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:31.915 [ 
00:10:31.915 { 00:10:31.915 "name": "BaseBdev1", 00:10:31.915 "aliases": [ 00:10:31.915 "492ec6c6-7827-42a0-aac4-7fed5437a448" 00:10:31.915 ], 00:10:31.915 "product_name": "Malloc disk", 00:10:31.915 "block_size": 512, 00:10:31.915 "num_blocks": 65536, 00:10:31.915 "uuid": "492ec6c6-7827-42a0-aac4-7fed5437a448", 00:10:31.915 "assigned_rate_limits": { 00:10:31.915 "rw_ios_per_sec": 0, 00:10:31.915 "rw_mbytes_per_sec": 0, 00:10:31.915 "r_mbytes_per_sec": 0, 00:10:31.915 "w_mbytes_per_sec": 0 00:10:31.915 }, 00:10:31.915 "claimed": true, 00:10:31.915 "claim_type": "exclusive_write", 00:10:31.915 "zoned": false, 00:10:31.915 "supported_io_types": { 00:10:31.915 "read": true, 00:10:31.915 "write": true, 00:10:31.915 "unmap": true, 00:10:31.915 "flush": true, 00:10:31.915 "reset": true, 00:10:31.915 "nvme_admin": false, 00:10:31.915 "nvme_io": false, 00:10:31.915 "nvme_io_md": false, 00:10:31.915 "write_zeroes": true, 00:10:31.915 "zcopy": true, 00:10:31.915 "get_zone_info": false, 00:10:31.915 "zone_management": false, 00:10:31.915 "zone_append": false, 00:10:31.915 "compare": false, 00:10:31.915 "compare_and_write": false, 00:10:31.915 "abort": true, 00:10:31.915 "seek_hole": false, 00:10:31.915 "seek_data": false, 00:10:31.915 "copy": true, 00:10:31.915 "nvme_iov_md": false 00:10:31.915 }, 00:10:31.915 "memory_domains": [ 00:10:31.915 { 00:10:31.915 "dma_device_id": "system", 00:10:31.915 "dma_device_type": 1 00:10:31.915 }, 00:10:31.915 { 00:10:31.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.915 "dma_device_type": 2 00:10:31.915 } 00:10:31.915 ], 00:10:31.915 "driver_specific": {} 00:10:31.915 } 00:10:31.915 ] 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:31.915 "name": "Existed_Raid", 00:10:31.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.915 "strip_size_kb": 64, 00:10:31.915 "state": "configuring", 00:10:31.915 "raid_level": "raid0", 
00:10:31.915 "superblock": false, 00:10:31.915 "num_base_bdevs": 2, 00:10:31.915 "num_base_bdevs_discovered": 1, 00:10:31.915 "num_base_bdevs_operational": 2, 00:10:31.915 "base_bdevs_list": [ 00:10:31.915 { 00:10:31.915 "name": "BaseBdev1", 00:10:31.915 "uuid": "492ec6c6-7827-42a0-aac4-7fed5437a448", 00:10:31.915 "is_configured": true, 00:10:31.915 "data_offset": 0, 00:10:31.915 "data_size": 65536 00:10:31.915 }, 00:10:31.915 { 00:10:31.915 "name": "BaseBdev2", 00:10:31.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:31.915 "is_configured": false, 00:10:31.915 "data_offset": 0, 00:10:31.915 "data_size": 0 00:10:31.915 } 00:10:31.915 ] 00:10:31.915 }' 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:31.915 07:46:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:32.486 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:32.746 [2024-07-15 07:46:17.348348] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:32.746 [2024-07-15 07:46:17.348372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1006fa0 name Existed_Raid, state configuring 00:10:32.746 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:33.006 [2024-07-15 07:46:17.540854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:33.007 [2024-07-15 07:46:17.541972] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:33.007 [2024-07-15 07:46:17.541998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:33.007 "name": "Existed_Raid", 00:10:33.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:33.007 "strip_size_kb": 64, 00:10:33.007 "state": "configuring", 00:10:33.007 "raid_level": "raid0", 00:10:33.007 "superblock": false, 00:10:33.007 "num_base_bdevs": 2, 00:10:33.007 "num_base_bdevs_discovered": 1, 00:10:33.007 "num_base_bdevs_operational": 2, 00:10:33.007 "base_bdevs_list": [ 00:10:33.007 { 00:10:33.007 "name": "BaseBdev1", 00:10:33.007 "uuid": "492ec6c6-7827-42a0-aac4-7fed5437a448", 00:10:33.007 "is_configured": true, 00:10:33.007 "data_offset": 0, 00:10:33.007 "data_size": 65536 00:10:33.007 }, 00:10:33.007 { 00:10:33.007 "name": "BaseBdev2", 00:10:33.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:33.007 "is_configured": false, 00:10:33.007 "data_offset": 0, 00:10:33.007 "data_size": 0 00:10:33.007 } 00:10:33.007 ] 00:10:33.007 }' 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:33.007 07:46:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:33.578 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:33.838 [2024-07-15 07:46:18.488093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:33.838 [2024-07-15 07:46:18.488116] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1007d90 00:10:33.838 [2024-07-15 07:46:18.488120] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:33.838 [2024-07-15 07:46:18.488262] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ab730 00:10:33.838 [2024-07-15 07:46:18.488350] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1007d90 00:10:33.838 [2024-07-15 07:46:18.488356] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1007d90 00:10:33.838 [2024-07-15 07:46:18.488475] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:33.838 BaseBdev2 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:33.838 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:34.098 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:34.359 [ 
00:10:34.359 { 00:10:34.359 "name": "BaseBdev2", 00:10:34.359 "aliases": [ 00:10:34.359 "74768788-79f6-4988-80d0-851ca2f6dd28" 00:10:34.359 ], 00:10:34.359 "product_name": "Malloc disk", 00:10:34.359 "block_size": 512, 00:10:34.359 "num_blocks": 65536, 00:10:34.359 "uuid": "74768788-79f6-4988-80d0-851ca2f6dd28", 00:10:34.359 "assigned_rate_limits": { 00:10:34.359 "rw_ios_per_sec": 0, 00:10:34.359 "rw_mbytes_per_sec": 0, 00:10:34.359 "r_mbytes_per_sec": 0, 00:10:34.359 "w_mbytes_per_sec": 0 00:10:34.359 }, 00:10:34.359 "claimed": true, 00:10:34.359 "claim_type": "exclusive_write", 00:10:34.359 "zoned": false, 00:10:34.359 "supported_io_types": { 00:10:34.359 "read": true, 00:10:34.359 "write": true, 00:10:34.359 "unmap": true, 00:10:34.359 "flush": true, 00:10:34.359 "reset": true, 00:10:34.359 "nvme_admin": false, 00:10:34.359 "nvme_io": false, 00:10:34.359 "nvme_io_md": false, 00:10:34.359 "write_zeroes": true, 00:10:34.359 "zcopy": true, 00:10:34.359 "get_zone_info": false, 00:10:34.359 "zone_management": false, 00:10:34.359 "zone_append": false, 00:10:34.359 "compare": false, 00:10:34.359 "compare_and_write": false, 00:10:34.359 "abort": true, 00:10:34.359 "seek_hole": false, 00:10:34.359 "seek_data": false, 00:10:34.359 "copy": true, 00:10:34.359 "nvme_iov_md": false 00:10:34.359 }, 00:10:34.359 "memory_domains": [ 00:10:34.359 { 00:10:34.359 "dma_device_id": "system", 00:10:34.359 "dma_device_type": 1 00:10:34.359 }, 00:10:34.359 { 00:10:34.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:34.359 "dma_device_type": 2 00:10:34.359 } 00:10:34.359 ], 00:10:34.359 "driver_specific": {} 00:10:34.359 } 00:10:34.359 ] 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:34.359 07:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:34.359 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:34.359 "name": "Existed_Raid", 00:10:34.359 "uuid": "36cb2a11-e4f1-4ebb-81d4-a8d7ab59aa77", 00:10:34.359 "strip_size_kb": 64, 00:10:34.359 "state": "online", 00:10:34.359 "raid_level": "raid0", 00:10:34.359 "superblock": false, 00:10:34.359 "num_base_bdevs": 2, 00:10:34.359 "num_base_bdevs_discovered": 2, 00:10:34.359 "num_base_bdevs_operational": 2, 00:10:34.359 "base_bdevs_list": [ 00:10:34.359 { 00:10:34.359 "name": "BaseBdev1", 00:10:34.359 "uuid": "492ec6c6-7827-42a0-aac4-7fed5437a448", 00:10:34.359 "is_configured": true, 00:10:34.359 "data_offset": 0, 00:10:34.359 "data_size": 65536 00:10:34.359 }, 00:10:34.359 { 00:10:34.359 "name": "BaseBdev2", 00:10:34.359 "uuid": "74768788-79f6-4988-80d0-851ca2f6dd28", 00:10:34.359 "is_configured": true, 00:10:34.359 "data_offset": 0, 00:10:34.359 "data_size": 65536 00:10:34.359 } 00:10:34.359 ] 00:10:34.359 }' 00:10:34.359 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.359 07:46:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:34.929 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:35.190 [2024-07-15 07:46:19.763546] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:35.190 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:35.190 "name": "Existed_Raid", 00:10:35.190 "aliases": [ 00:10:35.190 "36cb2a11-e4f1-4ebb-81d4-a8d7ab59aa77" 00:10:35.190 ], 00:10:35.190 "product_name": "Raid Volume", 00:10:35.190 "block_size": 512, 00:10:35.190 "num_blocks": 131072, 00:10:35.190 "uuid": "36cb2a11-e4f1-4ebb-81d4-a8d7ab59aa77", 00:10:35.190 "assigned_rate_limits": { 00:10:35.190 "rw_ios_per_sec": 0, 00:10:35.190 "rw_mbytes_per_sec": 0, 00:10:35.190 "r_mbytes_per_sec": 0, 00:10:35.190 "w_mbytes_per_sec": 0 00:10:35.190 }, 00:10:35.190 "claimed": false, 00:10:35.190 "zoned": false, 00:10:35.190 "supported_io_types": { 00:10:35.190 "read": true, 00:10:35.190 "write": true, 00:10:35.190 "unmap": true, 00:10:35.190 "flush": true, 00:10:35.190 "reset": true, 00:10:35.190 "nvme_admin": false, 00:10:35.190 "nvme_io": false, 00:10:35.190 "nvme_io_md": false, 00:10:35.190 "write_zeroes": true, 00:10:35.190 "zcopy": false, 00:10:35.190 "get_zone_info": false, 00:10:35.190 "zone_management": false, 00:10:35.190 "zone_append": false, 00:10:35.190 "compare": false, 00:10:35.190 "compare_and_write": false, 00:10:35.190 "abort": false, 00:10:35.190 "seek_hole": false, 00:10:35.190 "seek_data": false, 00:10:35.190 "copy": false, 00:10:35.190 "nvme_iov_md": false 00:10:35.190 }, 00:10:35.190 
"memory_domains": [ 00:10:35.190 { 00:10:35.190 "dma_device_id": "system", 00:10:35.190 "dma_device_type": 1 00:10:35.190 }, 00:10:35.190 { 00:10:35.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.190 "dma_device_type": 2 00:10:35.190 }, 00:10:35.190 { 00:10:35.190 "dma_device_id": "system", 00:10:35.190 "dma_device_type": 1 00:10:35.190 }, 00:10:35.190 { 00:10:35.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.190 "dma_device_type": 2 00:10:35.190 } 00:10:35.190 ], 00:10:35.190 "driver_specific": { 00:10:35.190 "raid": { 00:10:35.190 "uuid": "36cb2a11-e4f1-4ebb-81d4-a8d7ab59aa77", 00:10:35.190 "strip_size_kb": 64, 00:10:35.190 "state": "online", 00:10:35.190 "raid_level": "raid0", 00:10:35.190 "superblock": false, 00:10:35.190 "num_base_bdevs": 2, 00:10:35.190 "num_base_bdevs_discovered": 2, 00:10:35.190 "num_base_bdevs_operational": 2, 00:10:35.190 "base_bdevs_list": [ 00:10:35.190 { 00:10:35.190 "name": "BaseBdev1", 00:10:35.190 "uuid": "492ec6c6-7827-42a0-aac4-7fed5437a448", 00:10:35.190 "is_configured": true, 00:10:35.190 "data_offset": 0, 00:10:35.190 "data_size": 65536 00:10:35.190 }, 00:10:35.190 { 00:10:35.190 "name": "BaseBdev2", 00:10:35.190 "uuid": "74768788-79f6-4988-80d0-851ca2f6dd28", 00:10:35.190 "is_configured": true, 00:10:35.190 "data_offset": 0, 00:10:35.190 "data_size": 65536 00:10:35.190 } 00:10:35.190 ] 00:10:35.190 } 00:10:35.190 } 00:10:35.190 }' 00:10:35.190 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:35.190 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:35.190 BaseBdev2' 00:10:35.190 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:35.190 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:35.190 07:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:35.451 "name": "BaseBdev1", 00:10:35.451 "aliases": [ 00:10:35.451 "492ec6c6-7827-42a0-aac4-7fed5437a448" 00:10:35.451 ], 00:10:35.451 "product_name": "Malloc disk", 00:10:35.451 "block_size": 512, 00:10:35.451 "num_blocks": 65536, 00:10:35.451 "uuid": "492ec6c6-7827-42a0-aac4-7fed5437a448", 00:10:35.451 "assigned_rate_limits": { 00:10:35.451 "rw_ios_per_sec": 0, 00:10:35.451 "rw_mbytes_per_sec": 0, 00:10:35.451 "r_mbytes_per_sec": 0, 00:10:35.451 "w_mbytes_per_sec": 0 00:10:35.451 }, 00:10:35.451 "claimed": true, 00:10:35.451 "claim_type": "exclusive_write", 00:10:35.451 "zoned": false, 00:10:35.451 "supported_io_types": { 00:10:35.451 "read": true, 00:10:35.451 "write": true, 00:10:35.451 "unmap": true, 00:10:35.451 "flush": true, 00:10:35.451 "reset": true, 00:10:35.451 "nvme_admin": false, 00:10:35.451 "nvme_io": false, 00:10:35.451 "nvme_io_md": false, 00:10:35.451 "write_zeroes": true, 00:10:35.451 "zcopy": true, 00:10:35.451 "get_zone_info": false, 00:10:35.451 "zone_management": false, 00:10:35.451 "zone_append": false, 00:10:35.451 "compare": false, 00:10:35.451 "compare_and_write": false, 00:10:35.451 "abort": true, 00:10:35.451 "seek_hole": false, 00:10:35.451 "seek_data": false, 00:10:35.451 "copy": true, 00:10:35.451 "nvme_iov_md": false 00:10:35.451 }, 00:10:35.451 
"memory_domains": [ 00:10:35.451 { 00:10:35.451 "dma_device_id": "system", 00:10:35.451 "dma_device_type": 1 00:10:35.451 }, 00:10:35.451 { 00:10:35.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.451 "dma_device_type": 2 00:10:35.451 } 00:10:35.451 ], 00:10:35.451 "driver_specific": {} 00:10:35.451 }' 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:35.451 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:35.713 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:35.974 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:35.974 "name": "BaseBdev2", 00:10:35.974 "aliases": [ 00:10:35.974 "74768788-79f6-4988-80d0-851ca2f6dd28" 00:10:35.974 ], 00:10:35.974 "product_name": "Malloc disk", 00:10:35.974 "block_size": 512, 00:10:35.974 "num_blocks": 65536, 00:10:35.974 "uuid": "74768788-79f6-4988-80d0-851ca2f6dd28", 00:10:35.974 "assigned_rate_limits": { 00:10:35.974 "rw_ios_per_sec": 0, 00:10:35.974 "rw_mbytes_per_sec": 0, 00:10:35.974 "r_mbytes_per_sec": 0, 00:10:35.974 "w_mbytes_per_sec": 0 00:10:35.974 }, 00:10:35.974 "claimed": true, 00:10:35.974 "claim_type": "exclusive_write", 00:10:35.974 "zoned": false, 00:10:35.974 "supported_io_types": { 00:10:35.974 "read": true, 00:10:35.974 "write": true, 00:10:35.974 "unmap": true, 00:10:35.974 "flush": true, 00:10:35.974 "reset": true, 00:10:35.974 "nvme_admin": false, 00:10:35.974 "nvme_io": false, 00:10:35.974 "nvme_io_md": false, 00:10:35.974 "write_zeroes": true, 00:10:35.974 "zcopy": true, 00:10:35.974 "get_zone_info": false, 00:10:35.974 "zone_management": false, 00:10:35.974 "zone_append": false, 00:10:35.974 "compare": false, 00:10:35.974 "compare_and_write": false, 00:10:35.974 "abort": true, 00:10:35.974 "seek_hole": false, 00:10:35.974 "seek_data": false, 00:10:35.974 "copy": true, 00:10:35.974 "nvme_iov_md": false 00:10:35.974 }, 00:10:35.974 "memory_domains": [ 00:10:35.974 { 00:10:35.974 "dma_device_id": "system", 00:10:35.974 "dma_device_type": 1 00:10:35.974 }, 00:10:35.974 { 00:10:35.974 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:35.974 "dma_device_type": 2 00:10:35.974 } 00:10:35.974 ], 00:10:35.974 "driver_specific": {} 00:10:35.974 }' 00:10:35.974 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.974 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:35.974 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:35.974 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:35.974 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:36.235 07:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:36.495 [2024-07-15 07:46:21.090735] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:36.495 [2024-07-15 07:46:21.090752] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:36.495 [2024-07-15 07:46:21.090781] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:36.495 07:46:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.495 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:36.755 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:36.755 "name": "Existed_Raid", 00:10:36.755 "uuid": "36cb2a11-e4f1-4ebb-81d4-a8d7ab59aa77", 00:10:36.755 "strip_size_kb": 64, 00:10:36.755 "state": "offline", 00:10:36.755 "raid_level": "raid0", 00:10:36.755 "superblock": false, 00:10:36.755 "num_base_bdevs": 2, 00:10:36.755 "num_base_bdevs_discovered": 1, 00:10:36.755 "num_base_bdevs_operational": 1, 00:10:36.755 "base_bdevs_list": [ 00:10:36.755 { 00:10:36.755 "name": null, 00:10:36.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:36.755 "is_configured": false, 00:10:36.755 "data_offset": 0, 00:10:36.755 "data_size": 65536 00:10:36.755 }, 00:10:36.755 { 00:10:36.755 "name": "BaseBdev2", 00:10:36.755 "uuid": "74768788-79f6-4988-80d0-851ca2f6dd28", 00:10:36.755 "is_configured": true, 00:10:36.755 "data_offset": 0, 00:10:36.755 "data_size": 65536 00:10:36.755 } 00:10:36.755 ] 00:10:36.755 }' 00:10:36.755 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:36.755 07:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.323 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:37.323 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:37.323 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.323 07:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:37.323 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:37.323 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:37.323 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:37.583 [2024-07-15 07:46:22.201559] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:37.583 [2024-07-15 07:46:22.201598] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1007d90 name Existed_Raid, state offline 00:10:37.583 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:37.583 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:37.583 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.583 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:37.844 07:46:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1585710 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1585710 ']' 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1585710 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1585710 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1585710' 00:10:37.844 killing process with pid 1585710 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1585710 00:10:37.844 [2024-07-15 07:46:22.466795] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1585710 00:10:37.844 [2024-07-15 07:46:22.467404] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:37.844 00:10:37.844 real 0m9.555s 00:10:37.844 user 0m17.415s 00:10:37.844 sys 0m1.442s 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:37.844 07:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:37.844 ************************************ 00:10:37.844 END TEST raid_state_function_test 00:10:37.844 ************************************ 00:10:38.105 07:46:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:38.105 07:46:22 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:38.105 07:46:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:38.105 07:46:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.105 07:46:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:38.105 ************************************ 00:10:38.105 START TEST raid_state_function_test_sb 00:10:38.105 ************************************ 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1587561 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1587561' 00:10:38.105 Process raid pid: 1587561 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1587561 /var/tmp/spdk-raid.sock 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1587561 ']' 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:38.105 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:38.105 07:46:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:38.105 [2024-07-15 07:46:22.716763] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
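[Editor's note — illustrative sketch, not part of the captured console output.] The xtrace above and below drives a fixed SPDK RPC sequence against the bdev_svc app listening on /var/tmp/spdk-raid.sock. The condensed script below distills that sequence for readability; the rpc.py subcommands and flags are taken verbatim from the trace, while the wrapper shell (the $rpc variable, ordering of the teardown) is an editorial assumption, not the test's own code.

# Sketch only: assumes a bdev_svc instance is already serving RPCs on /var/tmp/spdk-raid.sock.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create the two malloc base bdevs (65536 blocks of 512 bytes) and wait for examine.
$rpc bdev_malloc_create 32 512 -b BaseBdev1
$rpc bdev_malloc_create 32 512 -b BaseBdev2
$rpc bdev_wait_for_examine

# Assemble a raid0 volume with a 64 KiB strip size; the _sb variant of the test
# adds -s so a superblock is written (data_offset 2048 in the dumps below).
$rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

# Query the raid state the same way verify_raid_bdev_state does in the trace.
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# Teardown: removing a base bdev of a raid0 volume takes the raid offline,
# after which the raid and remaining base bdev are deleted.
$rpc bdev_malloc_delete BaseBdev1
$rpc bdev_raid_delete Existed_Raid
$rpc bdev_malloc_delete BaseBdev2

The JSON dumps in the surrounding trace show the resulting state transitions: "configuring" while base bdevs are missing, "online" once both are claimed, and "offline" after a base bdev is removed.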
00:10:38.105 [2024-07-15 07:46:22.716807] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:38.105 [2024-07-15 07:46:22.804029] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:38.366 [2024-07-15 07:46:22.891960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:38.366 [2024-07-15 07:46:22.936785] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:38.366 [2024-07-15 07:46:22.936811] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:38.944 07:46:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:38.944 07:46:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:38.944 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:39.204 [2024-07-15 07:46:23.727985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:39.204 [2024-07-15 07:46:23.728028] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:39.204 [2024-07-15 07:46:23.728035] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:39.204 [2024-07-15 07:46:23.728041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:39.204 "name": "Existed_Raid", 00:10:39.204 "uuid": "1d7d1272-80eb-49f7-994b-fdab6cf6f6f5", 00:10:39.204 "strip_size_kb": 64, 00:10:39.204 "state": "configuring", 00:10:39.204 "raid_level": 
"raid0", 00:10:39.204 "superblock": true, 00:10:39.204 "num_base_bdevs": 2, 00:10:39.204 "num_base_bdevs_discovered": 0, 00:10:39.204 "num_base_bdevs_operational": 2, 00:10:39.204 "base_bdevs_list": [ 00:10:39.204 { 00:10:39.204 "name": "BaseBdev1", 00:10:39.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:39.204 "is_configured": false, 00:10:39.204 "data_offset": 0, 00:10:39.204 "data_size": 0 00:10:39.204 }, 00:10:39.204 { 00:10:39.204 "name": "BaseBdev2", 00:10:39.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:39.204 "is_configured": false, 00:10:39.204 "data_offset": 0, 00:10:39.204 "data_size": 0 00:10:39.204 } 00:10:39.204 ] 00:10:39.204 }' 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:39.204 07:46:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:39.775 07:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:40.035 [2024-07-15 07:46:24.690284] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:40.036 [2024-07-15 07:46:24.690307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbce6b0 name Existed_Raid, state configuring 00:10:40.036 07:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:40.295 [2024-07-15 07:46:24.874779] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:40.295 [2024-07-15 07:46:24.874799] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:40.295 [2024-07-15 07:46:24.874805] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:40.295 [2024-07-15 07:46:24.874811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:40.295 07:46:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:40.555 [2024-07-15 07:46:25.065211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:40.555 BaseBdev1 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:40.555 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:40.815 [ 00:10:40.815 { 00:10:40.815 "name": "BaseBdev1", 00:10:40.815 "aliases": [ 00:10:40.815 "45f61258-6678-4c95-be83-dcc82bfa0487" 00:10:40.815 ], 00:10:40.815 "product_name": "Malloc disk", 00:10:40.815 "block_size": 512, 00:10:40.815 "num_blocks": 65536, 00:10:40.815 "uuid": "45f61258-6678-4c95-be83-dcc82bfa0487", 00:10:40.815 "assigned_rate_limits": { 00:10:40.815 "rw_ios_per_sec": 0, 00:10:40.815 "rw_mbytes_per_sec": 0, 00:10:40.815 "r_mbytes_per_sec": 0, 00:10:40.815 "w_mbytes_per_sec": 0 00:10:40.815 }, 00:10:40.815 "claimed": true, 00:10:40.815 "claim_type": "exclusive_write", 00:10:40.815 "zoned": false, 00:10:40.815 "supported_io_types": { 00:10:40.815 "read": true, 00:10:40.815 "write": true, 00:10:40.815 "unmap": true, 00:10:40.815 "flush": true, 00:10:40.815 "reset": true, 00:10:40.815 "nvme_admin": false, 00:10:40.815 "nvme_io": false, 00:10:40.815 "nvme_io_md": false, 00:10:40.815 "write_zeroes": true, 00:10:40.815 "zcopy": true, 00:10:40.815 "get_zone_info": false, 00:10:40.815 "zone_management": false, 00:10:40.815 "zone_append": false, 00:10:40.815 "compare": false, 00:10:40.815 "compare_and_write": false, 00:10:40.815 "abort": true, 00:10:40.815 "seek_hole": false, 00:10:40.815 "seek_data": false, 00:10:40.815 "copy": true, 00:10:40.815 "nvme_iov_md": false 00:10:40.815 }, 00:10:40.815 "memory_domains": [ 00:10:40.815 { 00:10:40.815 "dma_device_id": "system", 00:10:40.815 "dma_device_type": 1 00:10:40.815 }, 00:10:40.815 { 00:10:40.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.815 "dma_device_type": 2 00:10:40.815 } 00:10:40.815 ], 00:10:40.815 "driver_specific": {} 00:10:40.815 } 00:10:40.815 ] 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.815 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:41.075 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:41.075 "name": 
"Existed_Raid", 00:10:41.075 "uuid": "515590d8-293d-49cb-a99b-45e295649628", 00:10:41.075 "strip_size_kb": 64, 00:10:41.075 "state": "configuring", 00:10:41.075 "raid_level": "raid0", 00:10:41.075 "superblock": true, 00:10:41.075 "num_base_bdevs": 2, 00:10:41.075 "num_base_bdevs_discovered": 1, 00:10:41.075 "num_base_bdevs_operational": 2, 00:10:41.075 "base_bdevs_list": [ 00:10:41.075 { 00:10:41.075 "name": "BaseBdev1", 00:10:41.075 "uuid": "45f61258-6678-4c95-be83-dcc82bfa0487", 00:10:41.075 "is_configured": true, 00:10:41.075 "data_offset": 2048, 00:10:41.075 "data_size": 63488 00:10:41.075 }, 00:10:41.075 { 00:10:41.075 "name": "BaseBdev2", 00:10:41.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.075 "is_configured": false, 00:10:41.075 "data_offset": 0, 00:10:41.075 "data_size": 0 00:10:41.075 } 00:10:41.075 ] 00:10:41.075 }' 00:10:41.075 07:46:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:41.075 07:46:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:41.645 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:41.645 [2024-07-15 07:46:26.340409] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:41.645 [2024-07-15 07:46:26.340433] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbcdfa0 name Existed_Raid, state configuring 00:10:41.645 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:41.905 [2024-07-15 07:46:26.544947] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:41.905 [2024-07-15 07:46:26.546057] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:41.905 [2024-07-15 07:46:26.546078] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.905 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:42.164 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:42.164 "name": "Existed_Raid", 00:10:42.164 "uuid": "256343f0-9ae8-4952-92fc-ca8502215147", 00:10:42.164 "strip_size_kb": 64, 00:10:42.164 "state": "configuring", 00:10:42.164 "raid_level": "raid0", 00:10:42.164 "superblock": true, 00:10:42.164 "num_base_bdevs": 2, 00:10:42.164 "num_base_bdevs_discovered": 1, 00:10:42.164 "num_base_bdevs_operational": 2, 00:10:42.164 "base_bdevs_list": [ 00:10:42.164 { 00:10:42.164 "name": "BaseBdev1", 00:10:42.164 "uuid": "45f61258-6678-4c95-be83-dcc82bfa0487", 00:10:42.164 "is_configured": true, 00:10:42.164 "data_offset": 2048, 00:10:42.164 "data_size": 63488 00:10:42.164 }, 00:10:42.164 { 00:10:42.164 "name": "BaseBdev2", 00:10:42.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:42.164 "is_configured": false, 00:10:42.164 "data_offset": 0, 00:10:42.164 "data_size": 0 00:10:42.164 } 00:10:42.164 ] 00:10:42.164 }' 00:10:42.164 07:46:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:42.164 07:46:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:42.867 [2024-07-15 07:46:27.476129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:42.867 [2024-07-15 07:46:27.476232] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbced90 00:10:42.867 [2024-07-15 07:46:27.476240] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:42.867 [2024-07-15 07:46:27.476374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd82750 00:10:42.867 [2024-07-15 07:46:27.476461] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbced90 00:10:42.867 [2024-07-15 07:46:27.476467] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbced90 00:10:42.867 [2024-07-15 07:46:27.476533] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:42.867 BaseBdev2 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:42.867 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:43.185 [ 00:10:43.185 { 00:10:43.185 "name": "BaseBdev2", 00:10:43.185 "aliases": [ 00:10:43.185 "35c96ea2-1314-4403-90f6-f510db61dc5b" 00:10:43.185 ], 00:10:43.185 "product_name": "Malloc disk", 00:10:43.185 "block_size": 512, 00:10:43.185 "num_blocks": 65536, 00:10:43.185 "uuid": "35c96ea2-1314-4403-90f6-f510db61dc5b", 00:10:43.185 "assigned_rate_limits": { 00:10:43.185 "rw_ios_per_sec": 0, 00:10:43.185 "rw_mbytes_per_sec": 0, 00:10:43.185 "r_mbytes_per_sec": 0, 00:10:43.185 "w_mbytes_per_sec": 0 00:10:43.185 }, 00:10:43.185 "claimed": true, 00:10:43.185 "claim_type": "exclusive_write", 00:10:43.185 "zoned": false, 00:10:43.185 "supported_io_types": { 00:10:43.185 "read": true, 00:10:43.185 "write": true, 00:10:43.185 "unmap": true, 00:10:43.185 "flush": true, 00:10:43.185 "reset": true, 00:10:43.185 "nvme_admin": false, 00:10:43.185 "nvme_io": false, 00:10:43.185 "nvme_io_md": false, 00:10:43.185 "write_zeroes": true, 00:10:43.185 "zcopy": true, 00:10:43.185 "get_zone_info": false, 00:10:43.185 "zone_management": false, 00:10:43.185 "zone_append": false, 00:10:43.185 "compare": false, 00:10:43.185 "compare_and_write": false, 00:10:43.185 "abort": true, 00:10:43.185 "seek_hole": false, 00:10:43.185 "seek_data": false, 00:10:43.185 "copy": true, 00:10:43.185 "nvme_iov_md": false 00:10:43.185 }, 00:10:43.185 "memory_domains": [ 00:10:43.185 { 00:10:43.185 "dma_device_id": "system", 00:10:43.185 "dma_device_type": 1 00:10:43.185 }, 00:10:43.185 { 00:10:43.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.185 "dma_device_type": 2 00:10:43.185 } 00:10:43.185 ], 00:10:43.185 "driver_specific": {} 00:10:43.185 } 00:10:43.185 ] 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.185 07:46:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:43.444 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:43.444 "name": "Existed_Raid", 00:10:43.444 "uuid": "256343f0-9ae8-4952-92fc-ca8502215147", 00:10:43.444 "strip_size_kb": 64, 00:10:43.444 "state": "online", 00:10:43.444 "raid_level": "raid0", 00:10:43.444 "superblock": true, 00:10:43.444 "num_base_bdevs": 2, 00:10:43.444 "num_base_bdevs_discovered": 2, 00:10:43.444 "num_base_bdevs_operational": 2, 00:10:43.444 "base_bdevs_list": [ 00:10:43.444 { 00:10:43.444 "name": "BaseBdev1", 00:10:43.445 "uuid": "45f61258-6678-4c95-be83-dcc82bfa0487", 00:10:43.445 "is_configured": true, 00:10:43.445 "data_offset": 2048, 00:10:43.445 "data_size": 63488 00:10:43.445 }, 00:10:43.445 { 00:10:43.445 "name": "BaseBdev2", 00:10:43.445 "uuid": "35c96ea2-1314-4403-90f6-f510db61dc5b", 00:10:43.445 "is_configured": true, 00:10:43.445 "data_offset": 2048, 00:10:43.445 "data_size": 63488 00:10:43.445 } 00:10:43.445 ] 00:10:43.445 }' 00:10:43.445 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:43.445 07:46:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:44.014 [2024-07-15 07:46:28.723489] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:44.014 "name": "Existed_Raid", 00:10:44.014 "aliases": [ 00:10:44.014 "256343f0-9ae8-4952-92fc-ca8502215147" 00:10:44.014 ], 00:10:44.014 "product_name": "Raid Volume", 00:10:44.014 "block_size": 512, 00:10:44.014 "num_blocks": 126976, 00:10:44.014 "uuid": "256343f0-9ae8-4952-92fc-ca8502215147", 00:10:44.014 "assigned_rate_limits": { 00:10:44.014 "rw_ios_per_sec": 0, 00:10:44.014 "rw_mbytes_per_sec": 0, 00:10:44.014 "r_mbytes_per_sec": 0, 00:10:44.014 "w_mbytes_per_sec": 0 00:10:44.014 }, 00:10:44.014 "claimed": false, 00:10:44.014 "zoned": false, 00:10:44.014 "supported_io_types": { 00:10:44.014 "read": true, 00:10:44.014 "write": true, 00:10:44.014 "unmap": true, 00:10:44.014 "flush": true, 00:10:44.014 "reset": true, 00:10:44.014 "nvme_admin": false, 00:10:44.014 "nvme_io": false, 00:10:44.014 "nvme_io_md": false, 00:10:44.014 "write_zeroes": true, 
00:10:44.014 "zcopy": false, 00:10:44.014 "get_zone_info": false, 00:10:44.014 "zone_management": false, 00:10:44.014 "zone_append": false, 00:10:44.014 "compare": false, 00:10:44.014 "compare_and_write": false, 00:10:44.014 "abort": false, 00:10:44.014 "seek_hole": false, 00:10:44.014 "seek_data": false, 00:10:44.014 "copy": false, 00:10:44.014 "nvme_iov_md": false 00:10:44.014 }, 00:10:44.014 "memory_domains": [ 00:10:44.014 { 00:10:44.014 "dma_device_id": "system", 00:10:44.014 "dma_device_type": 1 00:10:44.014 }, 00:10:44.014 { 00:10:44.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.014 "dma_device_type": 2 00:10:44.014 }, 00:10:44.014 { 00:10:44.014 "dma_device_id": "system", 00:10:44.014 "dma_device_type": 1 00:10:44.014 }, 00:10:44.014 { 00:10:44.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.014 "dma_device_type": 2 00:10:44.014 } 00:10:44.014 ], 00:10:44.014 "driver_specific": { 00:10:44.014 "raid": { 00:10:44.014 "uuid": "256343f0-9ae8-4952-92fc-ca8502215147", 00:10:44.014 "strip_size_kb": 64, 00:10:44.014 "state": "online", 00:10:44.014 "raid_level": "raid0", 00:10:44.014 "superblock": true, 00:10:44.014 "num_base_bdevs": 2, 00:10:44.014 "num_base_bdevs_discovered": 2, 00:10:44.014 "num_base_bdevs_operational": 2, 00:10:44.014 "base_bdevs_list": [ 00:10:44.014 { 00:10:44.014 "name": "BaseBdev1", 00:10:44.014 "uuid": "45f61258-6678-4c95-be83-dcc82bfa0487", 00:10:44.014 "is_configured": true, 00:10:44.014 "data_offset": 2048, 00:10:44.014 "data_size": 63488 00:10:44.014 }, 00:10:44.014 { 00:10:44.014 "name": "BaseBdev2", 00:10:44.014 "uuid": "35c96ea2-1314-4403-90f6-f510db61dc5b", 00:10:44.014 "is_configured": true, 00:10:44.014 "data_offset": 2048, 00:10:44.014 "data_size": 63488 00:10:44.014 } 00:10:44.014 ] 00:10:44.014 } 00:10:44.014 } 00:10:44.014 }' 00:10:44.014 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:44.274 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:44.274 BaseBdev2' 00:10:44.274 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:44.274 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:44.274 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:44.274 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:44.274 "name": "BaseBdev1", 00:10:44.274 "aliases": [ 00:10:44.274 "45f61258-6678-4c95-be83-dcc82bfa0487" 00:10:44.274 ], 00:10:44.274 "product_name": "Malloc disk", 00:10:44.274 "block_size": 512, 00:10:44.274 "num_blocks": 65536, 00:10:44.274 "uuid": "45f61258-6678-4c95-be83-dcc82bfa0487", 00:10:44.274 "assigned_rate_limits": { 00:10:44.274 "rw_ios_per_sec": 0, 00:10:44.274 "rw_mbytes_per_sec": 0, 00:10:44.274 "r_mbytes_per_sec": 0, 00:10:44.274 "w_mbytes_per_sec": 0 00:10:44.274 }, 00:10:44.274 "claimed": true, 00:10:44.274 "claim_type": "exclusive_write", 00:10:44.274 "zoned": false, 00:10:44.274 "supported_io_types": { 00:10:44.274 "read": true, 00:10:44.274 "write": true, 00:10:44.274 "unmap": true, 00:10:44.274 "flush": true, 00:10:44.274 "reset": true, 00:10:44.274 "nvme_admin": false, 00:10:44.274 "nvme_io": false, 00:10:44.274 "nvme_io_md": false, 00:10:44.274 
"write_zeroes": true, 00:10:44.274 "zcopy": true, 00:10:44.274 "get_zone_info": false, 00:10:44.274 "zone_management": false, 00:10:44.274 "zone_append": false, 00:10:44.274 "compare": false, 00:10:44.274 "compare_and_write": false, 00:10:44.274 "abort": true, 00:10:44.274 "seek_hole": false, 00:10:44.274 "seek_data": false, 00:10:44.274 "copy": true, 00:10:44.274 "nvme_iov_md": false 00:10:44.274 }, 00:10:44.274 "memory_domains": [ 00:10:44.274 { 00:10:44.274 "dma_device_id": "system", 00:10:44.274 "dma_device_type": 1 00:10:44.274 }, 00:10:44.274 { 00:10:44.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.274 "dma_device_type": 2 00:10:44.274 } 00:10:44.274 ], 00:10:44.274 "driver_specific": {} 00:10:44.274 }' 00:10:44.274 07:46:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:44.274 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:44.538 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:44.538 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:44.538 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:44.538 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:44.539 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.539 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.539 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:44.539 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.539 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.799 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:44.799 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:44.799 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:44.799 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:44.799 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:44.799 "name": "BaseBdev2", 00:10:44.799 "aliases": [ 00:10:44.799 "35c96ea2-1314-4403-90f6-f510db61dc5b" 00:10:44.799 ], 00:10:44.799 "product_name": "Malloc disk", 00:10:44.799 "block_size": 512, 00:10:44.799 "num_blocks": 65536, 00:10:44.799 "uuid": "35c96ea2-1314-4403-90f6-f510db61dc5b", 00:10:44.799 "assigned_rate_limits": { 00:10:44.799 "rw_ios_per_sec": 0, 00:10:44.799 "rw_mbytes_per_sec": 0, 00:10:44.799 "r_mbytes_per_sec": 0, 00:10:44.799 "w_mbytes_per_sec": 0 00:10:44.799 }, 00:10:44.799 "claimed": true, 00:10:44.799 "claim_type": "exclusive_write", 00:10:44.799 "zoned": false, 00:10:44.799 "supported_io_types": { 00:10:44.799 "read": true, 00:10:44.799 "write": true, 00:10:44.799 "unmap": true, 00:10:44.799 "flush": true, 00:10:44.799 "reset": true, 00:10:44.799 "nvme_admin": false, 00:10:44.799 "nvme_io": false, 00:10:44.799 "nvme_io_md": false, 00:10:44.799 "write_zeroes": true, 00:10:44.799 "zcopy": true, 00:10:44.799 "get_zone_info": false, 00:10:44.799 "zone_management": false, 
00:10:44.799 "zone_append": false, 00:10:44.799 "compare": false, 00:10:44.799 "compare_and_write": false, 00:10:44.799 "abort": true, 00:10:44.799 "seek_hole": false, 00:10:44.799 "seek_data": false, 00:10:44.799 "copy": true, 00:10:44.799 "nvme_iov_md": false 00:10:44.799 }, 00:10:44.799 "memory_domains": [ 00:10:44.799 { 00:10:44.799 "dma_device_id": "system", 00:10:44.799 "dma_device_type": 1 00:10:44.799 }, 00:10:44.799 { 00:10:44.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.799 "dma_device_type": 2 00:10:44.799 } 00:10:44.799 ], 00:10:44.799 "driver_specific": {} 00:10:44.799 }' 00:10:44.799 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:45.059 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:45.319 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:45.319 07:46:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:45.319 [2024-07-15 07:46:30.006547] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:45.319 [2024-07-15 07:46:30.006564] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:45.319 [2024-07-15 07:46:30.006592] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.319 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:45.578 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.578 "name": "Existed_Raid", 00:10:45.578 "uuid": "256343f0-9ae8-4952-92fc-ca8502215147", 00:10:45.578 "strip_size_kb": 64, 00:10:45.578 "state": "offline", 00:10:45.578 "raid_level": "raid0", 00:10:45.578 "superblock": true, 00:10:45.578 "num_base_bdevs": 2, 00:10:45.578 "num_base_bdevs_discovered": 1, 00:10:45.578 "num_base_bdevs_operational": 1, 00:10:45.578 "base_bdevs_list": [ 00:10:45.578 { 00:10:45.578 "name": null, 00:10:45.578 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.578 "is_configured": false, 00:10:45.578 "data_offset": 2048, 00:10:45.578 "data_size": 63488 00:10:45.578 }, 00:10:45.578 { 00:10:45.578 "name": "BaseBdev2", 00:10:45.578 "uuid": "35c96ea2-1314-4403-90f6-f510db61dc5b", 00:10:45.578 "is_configured": true, 00:10:45.578 "data_offset": 2048, 00:10:45.578 "data_size": 63488 00:10:45.578 } 00:10:45.578 ] 00:10:45.578 }' 00:10:45.578 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:45.578 07:46:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:46.147 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:46.147 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:46.147 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.147 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:46.407 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:46.407 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:46.407 07:46:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:46.407 [2024-07-15 07:46:31.153458] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:46.407 [2024-07-15 07:46:31.153488] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbced90 name Existed_Raid, state offline 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1587561 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1587561 ']' 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1587561 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1587561 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1587561' 00:10:46.667 killing process with pid 1587561 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1587561 00:10:46.667 [2024-07-15 07:46:31.404920] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:46.667 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1587561 00:10:46.667 [2024-07-15 07:46:31.405506] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:46.928 07:46:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:46.928 00:10:46.928 real 0m8.870s 00:10:46.928 user 0m16.085s 00:10:46.928 sys 0m1.387s 00:10:46.928 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:46.928 07:46:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:46.928 ************************************ 00:10:46.928 END TEST raid_state_function_test_sb 00:10:46.928 ************************************ 00:10:46.928 07:46:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:46.928 07:46:31 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:10:46.928 07:46:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:10:46.928 07:46:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:46.928 07:46:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:46.928 ************************************ 00:10:46.928 START TEST raid_superblock_test 00:10:46.928 ************************************ 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:10:46.928 07:46:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1589256 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1589256 /var/tmp/spdk-raid.sock 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1589256 ']' 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:46.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:46.928 07:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.928 [2024-07-15 07:46:31.659078] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
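The raid_superblock_test flow traced from this point on can be reproduced by hand against the same dedicated RPC socket. The lines below are a minimal sketch, assuming a built SPDK tree at $SPDK_DIR (a placeholder; the log itself uses the full Jenkins workspace path). Every rpc.py subcommand and flag is copied from the traced test; only the backgrounding of bdev_svc and the $RPC shorthand are additions for brevity.

  # start a bare bdev service with raid debug logging on a private RPC socket
  $SPDK_DIR/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
  RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # (the test's waitforlisten helper polls the pid and socket before issuing RPCs)

  # two small malloc bdevs (32 MB, 512-byte blocks), each wrapped in a passthru bdev with a fixed UUID
  $RPC bdev_malloc_create 32 512 -b malloc1
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $RPC bdev_malloc_create 32 512 -b malloc2
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

  # assemble a raid0 volume with a 64 KB strip size and superblock enabled (-s)
  $RPC bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s

  # inspect the result the same way verify_raid_bdev_state does
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

  # tear down
  $RPC bdev_raid_delete raid_bdev1

With the fixed -u UUIDs, pt1 and pt2 show up in base_bdevs_list exactly as in the JSON dumps further down the log.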
00:10:46.928 [2024-07-15 07:46:31.659127] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1589256 ] 00:10:47.189 [2024-07-15 07:46:31.747075] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.189 [2024-07-15 07:46:31.839895] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:47.189 [2024-07-15 07:46:31.892353] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.189 [2024-07-15 07:46:31.892384] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.761 07:46:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:47.761 07:46:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:10:47.761 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:48.021 malloc1 00:10:48.021 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:48.281 [2024-07-15 07:46:32.921364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:48.281 [2024-07-15 07:46:32.921415] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.281 [2024-07-15 07:46:32.921429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad3a20 00:10:48.281 [2024-07-15 07:46:32.921437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.281 [2024-07-15 07:46:32.922967] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.281 [2024-07-15 07:46:32.923009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:48.281 pt1 00:10:48.281 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:48.281 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:48.281 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:48.281 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:48.281 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:48.281 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:48.282 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:48.282 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:48.282 07:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:48.542 malloc2 00:10:48.542 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:48.803 [2024-07-15 07:46:33.327940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:48.803 [2024-07-15 07:46:33.327984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.803 [2024-07-15 07:46:33.327999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad4040 00:10:48.803 [2024-07-15 07:46:33.328006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.803 [2024-07-15 07:46:33.329368] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.803 [2024-07-15 07:46:33.329403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:48.803 pt2 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:48.803 [2024-07-15 07:46:33.532479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:48.803 [2024-07-15 07:46:33.533645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:48.803 [2024-07-15 07:46:33.533795] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc803d0 00:10:48.803 [2024-07-15 07:46:33.533804] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:48.803 [2024-07-15 07:46:33.533978] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7f7f0 00:10:48.803 [2024-07-15 07:46:33.534101] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc803d0 00:10:48.803 [2024-07-15 07:46:33.534106] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc803d0 00:10:48.803 [2024-07-15 07:46:33.534182] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:48.803 07:46:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.803 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:49.064 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.064 "name": "raid_bdev1", 00:10:49.064 "uuid": "1d50e515-7714-4c01-8926-a460832f77bc", 00:10:49.064 "strip_size_kb": 64, 00:10:49.064 "state": "online", 00:10:49.064 "raid_level": "raid0", 00:10:49.064 "superblock": true, 00:10:49.064 "num_base_bdevs": 2, 00:10:49.064 "num_base_bdevs_discovered": 2, 00:10:49.064 "num_base_bdevs_operational": 2, 00:10:49.064 "base_bdevs_list": [ 00:10:49.064 { 00:10:49.064 "name": "pt1", 00:10:49.064 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:49.064 "is_configured": true, 00:10:49.064 "data_offset": 2048, 00:10:49.064 "data_size": 63488 00:10:49.064 }, 00:10:49.064 { 00:10:49.064 "name": "pt2", 00:10:49.064 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:49.064 "is_configured": true, 00:10:49.064 "data_offset": 2048, 00:10:49.064 "data_size": 63488 00:10:49.064 } 00:10:49.064 ] 00:10:49.064 }' 00:10:49.064 07:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.064 07:46:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:49.635 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:49.897 [2024-07-15 07:46:34.507159] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:49.897 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:49.897 "name": "raid_bdev1", 00:10:49.897 "aliases": [ 00:10:49.897 "1d50e515-7714-4c01-8926-a460832f77bc" 00:10:49.897 ], 00:10:49.897 "product_name": "Raid Volume", 00:10:49.897 "block_size": 512, 00:10:49.897 "num_blocks": 126976, 00:10:49.897 "uuid": 
"1d50e515-7714-4c01-8926-a460832f77bc", 00:10:49.897 "assigned_rate_limits": { 00:10:49.897 "rw_ios_per_sec": 0, 00:10:49.897 "rw_mbytes_per_sec": 0, 00:10:49.897 "r_mbytes_per_sec": 0, 00:10:49.897 "w_mbytes_per_sec": 0 00:10:49.897 }, 00:10:49.897 "claimed": false, 00:10:49.897 "zoned": false, 00:10:49.897 "supported_io_types": { 00:10:49.897 "read": true, 00:10:49.897 "write": true, 00:10:49.897 "unmap": true, 00:10:49.897 "flush": true, 00:10:49.897 "reset": true, 00:10:49.897 "nvme_admin": false, 00:10:49.897 "nvme_io": false, 00:10:49.897 "nvme_io_md": false, 00:10:49.897 "write_zeroes": true, 00:10:49.897 "zcopy": false, 00:10:49.897 "get_zone_info": false, 00:10:49.897 "zone_management": false, 00:10:49.897 "zone_append": false, 00:10:49.897 "compare": false, 00:10:49.897 "compare_and_write": false, 00:10:49.897 "abort": false, 00:10:49.897 "seek_hole": false, 00:10:49.897 "seek_data": false, 00:10:49.897 "copy": false, 00:10:49.897 "nvme_iov_md": false 00:10:49.897 }, 00:10:49.897 "memory_domains": [ 00:10:49.897 { 00:10:49.897 "dma_device_id": "system", 00:10:49.897 "dma_device_type": 1 00:10:49.897 }, 00:10:49.897 { 00:10:49.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.897 "dma_device_type": 2 00:10:49.897 }, 00:10:49.897 { 00:10:49.897 "dma_device_id": "system", 00:10:49.897 "dma_device_type": 1 00:10:49.897 }, 00:10:49.897 { 00:10:49.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.897 "dma_device_type": 2 00:10:49.897 } 00:10:49.897 ], 00:10:49.897 "driver_specific": { 00:10:49.897 "raid": { 00:10:49.897 "uuid": "1d50e515-7714-4c01-8926-a460832f77bc", 00:10:49.897 "strip_size_kb": 64, 00:10:49.897 "state": "online", 00:10:49.897 "raid_level": "raid0", 00:10:49.897 "superblock": true, 00:10:49.897 "num_base_bdevs": 2, 00:10:49.897 "num_base_bdevs_discovered": 2, 00:10:49.897 "num_base_bdevs_operational": 2, 00:10:49.897 "base_bdevs_list": [ 00:10:49.897 { 00:10:49.897 "name": "pt1", 00:10:49.897 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:49.897 "is_configured": true, 00:10:49.897 "data_offset": 2048, 00:10:49.897 "data_size": 63488 00:10:49.897 }, 00:10:49.897 { 00:10:49.897 "name": "pt2", 00:10:49.897 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:49.897 "is_configured": true, 00:10:49.897 "data_offset": 2048, 00:10:49.897 "data_size": 63488 00:10:49.897 } 00:10:49.897 ] 00:10:49.897 } 00:10:49.897 } 00:10:49.897 }' 00:10:49.897 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:49.897 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:49.897 pt2' 00:10:49.897 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:49.897 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:49.897 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.158 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.158 "name": "pt1", 00:10:50.158 "aliases": [ 00:10:50.158 "00000000-0000-0000-0000-000000000001" 00:10:50.158 ], 00:10:50.158 "product_name": "passthru", 00:10:50.158 "block_size": 512, 00:10:50.158 "num_blocks": 65536, 00:10:50.158 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:50.158 "assigned_rate_limits": { 00:10:50.158 
"rw_ios_per_sec": 0, 00:10:50.158 "rw_mbytes_per_sec": 0, 00:10:50.158 "r_mbytes_per_sec": 0, 00:10:50.158 "w_mbytes_per_sec": 0 00:10:50.158 }, 00:10:50.158 "claimed": true, 00:10:50.158 "claim_type": "exclusive_write", 00:10:50.158 "zoned": false, 00:10:50.158 "supported_io_types": { 00:10:50.158 "read": true, 00:10:50.158 "write": true, 00:10:50.158 "unmap": true, 00:10:50.158 "flush": true, 00:10:50.158 "reset": true, 00:10:50.158 "nvme_admin": false, 00:10:50.158 "nvme_io": false, 00:10:50.158 "nvme_io_md": false, 00:10:50.158 "write_zeroes": true, 00:10:50.158 "zcopy": true, 00:10:50.158 "get_zone_info": false, 00:10:50.158 "zone_management": false, 00:10:50.158 "zone_append": false, 00:10:50.158 "compare": false, 00:10:50.158 "compare_and_write": false, 00:10:50.158 "abort": true, 00:10:50.158 "seek_hole": false, 00:10:50.158 "seek_data": false, 00:10:50.158 "copy": true, 00:10:50.158 "nvme_iov_md": false 00:10:50.158 }, 00:10:50.158 "memory_domains": [ 00:10:50.158 { 00:10:50.158 "dma_device_id": "system", 00:10:50.158 "dma_device_type": 1 00:10:50.158 }, 00:10:50.158 { 00:10:50.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.158 "dma_device_type": 2 00:10:50.158 } 00:10:50.158 ], 00:10:50.158 "driver_specific": { 00:10:50.158 "passthru": { 00:10:50.158 "name": "pt1", 00:10:50.158 "base_bdev_name": "malloc1" 00:10:50.158 } 00:10:50.158 } 00:10:50.158 }' 00:10:50.158 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.158 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.158 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:50.159 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.418 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.419 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:50.419 07:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:50.419 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.678 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.678 "name": "pt2", 00:10:50.678 "aliases": [ 00:10:50.678 "00000000-0000-0000-0000-000000000002" 00:10:50.678 ], 00:10:50.678 "product_name": "passthru", 00:10:50.678 "block_size": 512, 00:10:50.678 "num_blocks": 65536, 00:10:50.678 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:50.678 "assigned_rate_limits": { 00:10:50.678 "rw_ios_per_sec": 0, 00:10:50.678 "rw_mbytes_per_sec": 0, 00:10:50.678 "r_mbytes_per_sec": 0, 00:10:50.678 "w_mbytes_per_sec": 0 
00:10:50.678 }, 00:10:50.678 "claimed": true, 00:10:50.678 "claim_type": "exclusive_write", 00:10:50.678 "zoned": false, 00:10:50.678 "supported_io_types": { 00:10:50.678 "read": true, 00:10:50.678 "write": true, 00:10:50.678 "unmap": true, 00:10:50.678 "flush": true, 00:10:50.678 "reset": true, 00:10:50.678 "nvme_admin": false, 00:10:50.678 "nvme_io": false, 00:10:50.678 "nvme_io_md": false, 00:10:50.678 "write_zeroes": true, 00:10:50.678 "zcopy": true, 00:10:50.679 "get_zone_info": false, 00:10:50.679 "zone_management": false, 00:10:50.679 "zone_append": false, 00:10:50.679 "compare": false, 00:10:50.679 "compare_and_write": false, 00:10:50.679 "abort": true, 00:10:50.679 "seek_hole": false, 00:10:50.679 "seek_data": false, 00:10:50.679 "copy": true, 00:10:50.679 "nvme_iov_md": false 00:10:50.679 }, 00:10:50.679 "memory_domains": [ 00:10:50.679 { 00:10:50.679 "dma_device_id": "system", 00:10:50.679 "dma_device_type": 1 00:10:50.679 }, 00:10:50.679 { 00:10:50.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.679 "dma_device_type": 2 00:10:50.679 } 00:10:50.679 ], 00:10:50.679 "driver_specific": { 00:10:50.679 "passthru": { 00:10:50.679 "name": "pt2", 00:10:50.679 "base_bdev_name": "malloc2" 00:10:50.679 } 00:10:50.679 } 00:10:50.679 }' 00:10:50.679 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.679 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.938 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:51.198 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:51.198 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:51.198 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:51.198 [2024-07-15 07:46:35.886611] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:51.198 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1d50e515-7714-4c01-8926-a460832f77bc 00:10:51.198 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1d50e515-7714-4c01-8926-a460832f77bc ']' 00:10:51.198 07:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:51.457 [2024-07-15 07:46:36.078895] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:51.457 [2024-07-15 07:46:36.078909] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:10:51.457 [2024-07-15 07:46:36.078948] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:51.457 [2024-07-15 07:46:36.078981] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:51.457 [2024-07-15 07:46:36.078987] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc803d0 name raid_bdev1, state offline 00:10:51.457 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.457 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:51.716 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:51.716 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:51.716 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:51.716 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:51.975 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:51.975 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:51.975 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:51.975 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:52.234 07:46:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:52.493 [2024-07-15 07:46:37.045311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:52.493 [2024-07-15 07:46:37.046407] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:52.493 [2024-07-15 07:46:37.046449] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:52.493 [2024-07-15 07:46:37.046477] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:52.493 [2024-07-15 07:46:37.046488] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:52.493 [2024-07-15 07:46:37.046493] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad3070 name raid_bdev1, state configuring 00:10:52.493 request: 00:10:52.493 { 00:10:52.493 "name": "raid_bdev1", 00:10:52.493 "raid_level": "raid0", 00:10:52.493 "base_bdevs": [ 00:10:52.493 "malloc1", 00:10:52.493 "malloc2" 00:10:52.493 ], 00:10:52.493 "strip_size_kb": 64, 00:10:52.493 "superblock": false, 00:10:52.493 "method": "bdev_raid_create", 00:10:52.493 "req_id": 1 00:10:52.493 } 00:10:52.493 Got JSON-RPC error response 00:10:52.493 response: 00:10:52.493 { 00:10:52.493 "code": -17, 00:10:52.493 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:52.493 } 00:10:52.493 07:46:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:52.493 07:46:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:52.493 07:46:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:52.493 07:46:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:52.493 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.493 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:52.752 [2024-07-15 07:46:37.414204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:52.752 [2024-07-15 07:46:37.414235] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:52.752 [2024-07-15 07:46:37.414246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad4e00 00:10:52.752 [2024-07-15 07:46:37.414252] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:52.752 [2024-07-15 07:46:37.415512] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:52.752 [2024-07-15 07:46:37.415532] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:52.752 [2024-07-15 07:46:37.415578] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:52.752 [2024-07-15 07:46:37.415595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:52.752 pt1 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.752 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:53.011 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:53.011 "name": "raid_bdev1", 00:10:53.011 "uuid": "1d50e515-7714-4c01-8926-a460832f77bc", 00:10:53.011 "strip_size_kb": 64, 00:10:53.011 "state": "configuring", 00:10:53.011 "raid_level": "raid0", 00:10:53.011 "superblock": true, 00:10:53.011 "num_base_bdevs": 2, 00:10:53.011 "num_base_bdevs_discovered": 1, 00:10:53.011 "num_base_bdevs_operational": 2, 00:10:53.011 "base_bdevs_list": [ 00:10:53.011 { 00:10:53.011 "name": "pt1", 00:10:53.011 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:53.011 "is_configured": true, 00:10:53.011 "data_offset": 2048, 00:10:53.011 "data_size": 63488 00:10:53.011 }, 00:10:53.011 { 00:10:53.011 "name": null, 00:10:53.011 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:53.011 "is_configured": false, 00:10:53.011 "data_offset": 2048, 00:10:53.011 "data_size": 63488 00:10:53.011 } 00:10:53.011 ] 00:10:53.011 }' 00:10:53.011 07:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.011 07:46:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:53.578 [2024-07-15 07:46:38.308517] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:53.578 [2024-07-15 07:46:38.308547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:53.578 [2024-07-15 07:46:38.308557] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xad43e0 00:10:53.578 [2024-07-15 07:46:38.308563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:53.578 [2024-07-15 07:46:38.308826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:53.578 [2024-07-15 07:46:38.308839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:53.578 [2024-07-15 07:46:38.308880] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:53.578 [2024-07-15 07:46:38.308893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:53.578 [2024-07-15 07:46:38.308966] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xad2680 00:10:53.578 [2024-07-15 07:46:38.308973] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:53.578 [2024-07-15 07:46:38.309104] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7f7f0 00:10:53.578 [2024-07-15 07:46:38.309200] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad2680 00:10:53.578 [2024-07-15 07:46:38.309206] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xad2680 00:10:53.578 [2024-07-15 07:46:38.309276] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:53.578 pt2 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.578 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:53.837 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:53.837 
"name": "raid_bdev1", 00:10:53.837 "uuid": "1d50e515-7714-4c01-8926-a460832f77bc", 00:10:53.837 "strip_size_kb": 64, 00:10:53.837 "state": "online", 00:10:53.837 "raid_level": "raid0", 00:10:53.837 "superblock": true, 00:10:53.837 "num_base_bdevs": 2, 00:10:53.837 "num_base_bdevs_discovered": 2, 00:10:53.837 "num_base_bdevs_operational": 2, 00:10:53.837 "base_bdevs_list": [ 00:10:53.837 { 00:10:53.837 "name": "pt1", 00:10:53.837 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:53.837 "is_configured": true, 00:10:53.837 "data_offset": 2048, 00:10:53.837 "data_size": 63488 00:10:53.837 }, 00:10:53.837 { 00:10:53.837 "name": "pt2", 00:10:53.837 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:53.837 "is_configured": true, 00:10:53.837 "data_offset": 2048, 00:10:53.837 "data_size": 63488 00:10:53.837 } 00:10:53.837 ] 00:10:53.837 }' 00:10:53.837 07:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.837 07:46:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:54.404 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:54.662 [2024-07-15 07:46:39.198964] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:54.662 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:54.662 "name": "raid_bdev1", 00:10:54.662 "aliases": [ 00:10:54.662 "1d50e515-7714-4c01-8926-a460832f77bc" 00:10:54.662 ], 00:10:54.662 "product_name": "Raid Volume", 00:10:54.662 "block_size": 512, 00:10:54.662 "num_blocks": 126976, 00:10:54.662 "uuid": "1d50e515-7714-4c01-8926-a460832f77bc", 00:10:54.662 "assigned_rate_limits": { 00:10:54.662 "rw_ios_per_sec": 0, 00:10:54.662 "rw_mbytes_per_sec": 0, 00:10:54.662 "r_mbytes_per_sec": 0, 00:10:54.662 "w_mbytes_per_sec": 0 00:10:54.662 }, 00:10:54.662 "claimed": false, 00:10:54.662 "zoned": false, 00:10:54.662 "supported_io_types": { 00:10:54.662 "read": true, 00:10:54.662 "write": true, 00:10:54.662 "unmap": true, 00:10:54.662 "flush": true, 00:10:54.662 "reset": true, 00:10:54.662 "nvme_admin": false, 00:10:54.662 "nvme_io": false, 00:10:54.662 "nvme_io_md": false, 00:10:54.662 "write_zeroes": true, 00:10:54.662 "zcopy": false, 00:10:54.663 "get_zone_info": false, 00:10:54.663 "zone_management": false, 00:10:54.663 "zone_append": false, 00:10:54.663 "compare": false, 00:10:54.663 "compare_and_write": false, 00:10:54.663 "abort": false, 00:10:54.663 "seek_hole": false, 00:10:54.663 "seek_data": false, 00:10:54.663 "copy": false, 00:10:54.663 "nvme_iov_md": false 00:10:54.663 }, 00:10:54.663 "memory_domains": [ 00:10:54.663 { 00:10:54.663 "dma_device_id": "system", 
00:10:54.663 "dma_device_type": 1 00:10:54.663 }, 00:10:54.663 { 00:10:54.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.663 "dma_device_type": 2 00:10:54.663 }, 00:10:54.663 { 00:10:54.663 "dma_device_id": "system", 00:10:54.663 "dma_device_type": 1 00:10:54.663 }, 00:10:54.663 { 00:10:54.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.663 "dma_device_type": 2 00:10:54.663 } 00:10:54.663 ], 00:10:54.663 "driver_specific": { 00:10:54.663 "raid": { 00:10:54.663 "uuid": "1d50e515-7714-4c01-8926-a460832f77bc", 00:10:54.663 "strip_size_kb": 64, 00:10:54.663 "state": "online", 00:10:54.663 "raid_level": "raid0", 00:10:54.663 "superblock": true, 00:10:54.663 "num_base_bdevs": 2, 00:10:54.663 "num_base_bdevs_discovered": 2, 00:10:54.663 "num_base_bdevs_operational": 2, 00:10:54.663 "base_bdevs_list": [ 00:10:54.663 { 00:10:54.663 "name": "pt1", 00:10:54.663 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:54.663 "is_configured": true, 00:10:54.663 "data_offset": 2048, 00:10:54.663 "data_size": 63488 00:10:54.663 }, 00:10:54.663 { 00:10:54.663 "name": "pt2", 00:10:54.663 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:54.663 "is_configured": true, 00:10:54.663 "data_offset": 2048, 00:10:54.663 "data_size": 63488 00:10:54.663 } 00:10:54.663 ] 00:10:54.663 } 00:10:54.663 } 00:10:54.663 }' 00:10:54.663 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:54.663 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:54.663 pt2' 00:10:54.663 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:54.663 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:54.663 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:54.921 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:54.921 "name": "pt1", 00:10:54.921 "aliases": [ 00:10:54.921 "00000000-0000-0000-0000-000000000001" 00:10:54.921 ], 00:10:54.921 "product_name": "passthru", 00:10:54.921 "block_size": 512, 00:10:54.921 "num_blocks": 65536, 00:10:54.921 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:54.921 "assigned_rate_limits": { 00:10:54.921 "rw_ios_per_sec": 0, 00:10:54.921 "rw_mbytes_per_sec": 0, 00:10:54.921 "r_mbytes_per_sec": 0, 00:10:54.921 "w_mbytes_per_sec": 0 00:10:54.921 }, 00:10:54.921 "claimed": true, 00:10:54.921 "claim_type": "exclusive_write", 00:10:54.921 "zoned": false, 00:10:54.921 "supported_io_types": { 00:10:54.921 "read": true, 00:10:54.921 "write": true, 00:10:54.921 "unmap": true, 00:10:54.921 "flush": true, 00:10:54.921 "reset": true, 00:10:54.921 "nvme_admin": false, 00:10:54.921 "nvme_io": false, 00:10:54.921 "nvme_io_md": false, 00:10:54.921 "write_zeroes": true, 00:10:54.921 "zcopy": true, 00:10:54.921 "get_zone_info": false, 00:10:54.921 "zone_management": false, 00:10:54.921 "zone_append": false, 00:10:54.921 "compare": false, 00:10:54.921 "compare_and_write": false, 00:10:54.921 "abort": true, 00:10:54.921 "seek_hole": false, 00:10:54.921 "seek_data": false, 00:10:54.921 "copy": true, 00:10:54.921 "nvme_iov_md": false 00:10:54.921 }, 00:10:54.921 "memory_domains": [ 00:10:54.921 { 00:10:54.921 "dma_device_id": "system", 00:10:54.921 "dma_device_type": 1 00:10:54.921 }, 00:10:54.921 { 
00:10:54.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.921 "dma_device_type": 2 00:10:54.921 } 00:10:54.921 ], 00:10:54.921 "driver_specific": { 00:10:54.921 "passthru": { 00:10:54.921 "name": "pt1", 00:10:54.922 "base_bdev_name": "malloc1" 00:10:54.922 } 00:10:54.922 } 00:10:54.922 }' 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.922 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:55.179 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.438 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.438 "name": "pt2", 00:10:55.438 "aliases": [ 00:10:55.438 "00000000-0000-0000-0000-000000000002" 00:10:55.438 ], 00:10:55.438 "product_name": "passthru", 00:10:55.438 "block_size": 512, 00:10:55.438 "num_blocks": 65536, 00:10:55.438 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:55.438 "assigned_rate_limits": { 00:10:55.438 "rw_ios_per_sec": 0, 00:10:55.438 "rw_mbytes_per_sec": 0, 00:10:55.438 "r_mbytes_per_sec": 0, 00:10:55.438 "w_mbytes_per_sec": 0 00:10:55.438 }, 00:10:55.438 "claimed": true, 00:10:55.438 "claim_type": "exclusive_write", 00:10:55.438 "zoned": false, 00:10:55.438 "supported_io_types": { 00:10:55.438 "read": true, 00:10:55.438 "write": true, 00:10:55.438 "unmap": true, 00:10:55.438 "flush": true, 00:10:55.438 "reset": true, 00:10:55.438 "nvme_admin": false, 00:10:55.438 "nvme_io": false, 00:10:55.438 "nvme_io_md": false, 00:10:55.438 "write_zeroes": true, 00:10:55.438 "zcopy": true, 00:10:55.438 "get_zone_info": false, 00:10:55.438 "zone_management": false, 00:10:55.438 "zone_append": false, 00:10:55.438 "compare": false, 00:10:55.438 "compare_and_write": false, 00:10:55.438 "abort": true, 00:10:55.438 "seek_hole": false, 00:10:55.438 "seek_data": false, 00:10:55.438 "copy": true, 00:10:55.438 "nvme_iov_md": false 00:10:55.438 }, 00:10:55.438 "memory_domains": [ 00:10:55.438 { 00:10:55.438 "dma_device_id": "system", 00:10:55.438 "dma_device_type": 1 00:10:55.438 }, 00:10:55.438 { 00:10:55.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.438 "dma_device_type": 2 00:10:55.438 } 00:10:55.438 ], 00:10:55.438 
"driver_specific": { 00:10:55.438 "passthru": { 00:10:55.438 "name": "pt2", 00:10:55.438 "base_bdev_name": "malloc2" 00:10:55.438 } 00:10:55.438 } 00:10:55.438 }' 00:10:55.438 07:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.438 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.438 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.438 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.438 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.438 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.438 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:55.697 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:55.957 [2024-07-15 07:46:40.494254] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1d50e515-7714-4c01-8926-a460832f77bc '!=' 1d50e515-7714-4c01-8926-a460832f77bc ']' 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1589256 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1589256 ']' 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1589256 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1589256 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1589256' 00:10:55.957 killing process with pid 1589256 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1589256 00:10:55.957 [2024-07-15 07:46:40.564859] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:55.957 [2024-07-15 07:46:40.564900] 
bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:55.957 [2024-07-15 07:46:40.564933] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:55.957 [2024-07-15 07:46:40.564939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad2680 name raid_bdev1, state offline 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1589256 00:10:55.957 [2024-07-15 07:46:40.574209] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:55.957 00:10:55.957 real 0m9.089s 00:10:55.957 user 0m16.501s 00:10:55.957 sys 0m1.478s 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:55.957 07:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.957 ************************************ 00:10:55.957 END TEST raid_superblock_test 00:10:55.957 ************************************ 00:10:56.217 07:46:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:56.217 07:46:40 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:10:56.217 07:46:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:56.217 07:46:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:56.217 07:46:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:56.217 ************************************ 00:10:56.217 START TEST raid_read_error_test 00:10:56.217 ************************************ 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.s09eddTVpu 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1591004 00:10:56.217 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1591004 /var/tmp/spdk-raid.sock 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1591004 ']' 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:56.218 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:56.218 07:46:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:56.218 [2024-07-15 07:46:40.833687] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:10:56.218 [2024-07-15 07:46:40.833746] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591004 ] 00:10:56.218 [2024-07-15 07:46:40.927244] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:56.477 [2024-07-15 07:46:41.001739] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:56.477 [2024-07-15 07:46:41.044941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:56.477 [2024-07-15 07:46:41.044966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:57.046 07:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:57.046 07:46:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:10:57.046 07:46:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:57.046 07:46:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:57.306 BaseBdev1_malloc 00:10:57.306 07:46:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:57.306 true 00:10:57.306 07:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:57.567 [2024-07-15 07:46:42.212457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:57.567 [2024-07-15 07:46:42.212491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:57.567 [2024-07-15 07:46:42.212501] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db0b50 00:10:57.567 [2024-07-15 07:46:42.212508] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:57.567 [2024-07-15 07:46:42.213786] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:57.567 [2024-07-15 07:46:42.213806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:57.567 BaseBdev1 00:10:57.567 07:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:57.567 07:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:57.826 BaseBdev2_malloc 00:10:57.827 07:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:58.086 true 00:10:58.086 07:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:58.086 [2024-07-15 07:46:42.811663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:58.086 [2024-07-15 07:46:42.811692] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.086 [2024-07-15 07:46:42.811702] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d94ea0 00:10:58.086 [2024-07-15 07:46:42.811711] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.086 [2024-07-15 07:46:42.812860] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.086 [2024-07-15 07:46:42.812878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:58.086 BaseBdev2 00:10:58.086 07:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:58.345 [2024-07-15 07:46:43.004166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:58.345 [2024-07-15 07:46:43.005137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:58.345 [2024-07-15 07:46:43.005269] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bfe360 00:10:58.345 [2024-07-15 07:46:43.005277] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:58.345 [2024-07-15 07:46:43.005411] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bfb8a0 00:10:58.345 [2024-07-15 07:46:43.005521] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bfe360 00:10:58.345 [2024-07-15 07:46:43.005527] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bfe360 00:10:58.345 [2024-07-15 07:46:43.005600] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.345 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:58.604 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.604 "name": "raid_bdev1", 00:10:58.604 "uuid": "7a0c6ab0-588a-4c79-9975-01040427c75d", 00:10:58.604 "strip_size_kb": 64, 00:10:58.604 "state": "online", 00:10:58.604 "raid_level": "raid0", 
00:10:58.604 "superblock": true, 00:10:58.604 "num_base_bdevs": 2, 00:10:58.604 "num_base_bdevs_discovered": 2, 00:10:58.604 "num_base_bdevs_operational": 2, 00:10:58.604 "base_bdevs_list": [ 00:10:58.604 { 00:10:58.604 "name": "BaseBdev1", 00:10:58.604 "uuid": "632e4839-e1e0-5845-8cdb-3357516f50d3", 00:10:58.604 "is_configured": true, 00:10:58.604 "data_offset": 2048, 00:10:58.604 "data_size": 63488 00:10:58.604 }, 00:10:58.604 { 00:10:58.604 "name": "BaseBdev2", 00:10:58.604 "uuid": "b5ce403b-a545-5e31-a0dc-82bdecc423b0", 00:10:58.604 "is_configured": true, 00:10:58.604 "data_offset": 2048, 00:10:58.604 "data_size": 63488 00:10:58.604 } 00:10:58.604 ] 00:10:58.604 }' 00:10:58.604 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.604 07:46:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.171 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:59.171 07:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:59.171 [2024-07-15 07:46:43.810434] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d96270 00:11:00.110 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.373 07:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:00.373 07:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:00.373 "name": "raid_bdev1", 00:11:00.373 "uuid": "7a0c6ab0-588a-4c79-9975-01040427c75d", 00:11:00.373 "strip_size_kb": 64, 00:11:00.373 "state": "online", 00:11:00.373 
"raid_level": "raid0", 00:11:00.373 "superblock": true, 00:11:00.373 "num_base_bdevs": 2, 00:11:00.373 "num_base_bdevs_discovered": 2, 00:11:00.373 "num_base_bdevs_operational": 2, 00:11:00.373 "base_bdevs_list": [ 00:11:00.373 { 00:11:00.373 "name": "BaseBdev1", 00:11:00.373 "uuid": "632e4839-e1e0-5845-8cdb-3357516f50d3", 00:11:00.373 "is_configured": true, 00:11:00.373 "data_offset": 2048, 00:11:00.373 "data_size": 63488 00:11:00.373 }, 00:11:00.373 { 00:11:00.373 "name": "BaseBdev2", 00:11:00.373 "uuid": "b5ce403b-a545-5e31-a0dc-82bdecc423b0", 00:11:00.373 "is_configured": true, 00:11:00.373 "data_offset": 2048, 00:11:00.373 "data_size": 63488 00:11:00.373 } 00:11:00.373 ] 00:11:00.373 }' 00:11:00.373 07:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:00.373 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.941 07:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:01.201 [2024-07-15 07:46:45.826069] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:01.201 [2024-07-15 07:46:45.826101] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:01.201 [2024-07-15 07:46:45.828678] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:01.201 [2024-07-15 07:46:45.828702] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:01.201 [2024-07-15 07:46:45.828728] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:01.201 [2024-07-15 07:46:45.828734] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bfe360 name raid_bdev1, state offline 00:11:01.201 0 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1591004 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1591004 ']' 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1591004 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1591004 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1591004' 00:11:01.201 killing process with pid 1591004 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1591004 00:11:01.201 [2024-07-15 07:46:45.913471] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:01.201 07:46:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1591004 00:11:01.201 [2024-07-15 07:46:45.919316] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.s09eddTVpu 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:01.461 00:11:01.461 real 0m5.285s 00:11:01.461 user 0m8.291s 00:11:01.461 sys 0m0.742s 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:01.461 07:46:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.461 ************************************ 00:11:01.461 END TEST raid_read_error_test 00:11:01.461 ************************************ 00:11:01.461 07:46:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:01.461 07:46:46 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:11:01.461 07:46:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:01.461 07:46:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:01.461 07:46:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:01.461 ************************************ 00:11:01.461 START TEST raid_write_error_test 00:11:01.461 ************************************ 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:01.461 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ihYu2YHWfG 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1591967 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1591967 /var/tmp/spdk-raid.sock 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1591967 ']' 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:01.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:01.462 07:46:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.462 [2024-07-15 07:46:46.205803] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:11:01.462 [2024-07-15 07:46:46.205861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591967 ] 00:11:01.722 [2024-07-15 07:46:46.295975] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:01.722 [2024-07-15 07:46:46.364229] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:01.722 [2024-07-15 07:46:46.408641] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:01.722 [2024-07-15 07:46:46.408666] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.293 07:46:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:02.293 07:46:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:02.293 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:02.293 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:02.553 BaseBdev1_malloc 00:11:02.553 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:02.813 true 00:11:02.813 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:03.073 [2024-07-15 07:46:47.595273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:03.073 [2024-07-15 07:46:47.595304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:03.073 [2024-07-15 07:46:47.595315] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18edb50 00:11:03.073 [2024-07-15 07:46:47.595322] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:03.073 [2024-07-15 07:46:47.596629] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:03.073 [2024-07-15 07:46:47.596650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:03.073 BaseBdev1 00:11:03.073 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:03.073 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:03.073 BaseBdev2_malloc 00:11:03.073 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:03.333 true 00:11:03.333 07:46:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:03.593 [2024-07-15 07:46:48.154652] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:03.593 [2024-07-15 07:46:48.154682] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:03.593 [2024-07-15 07:46:48.154693] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d1ea0 00:11:03.593 [2024-07-15 07:46:48.154700] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:03.593 [2024-07-15 07:46:48.155888] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:03.593 [2024-07-15 07:46:48.155908] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:03.593 BaseBdev2 00:11:03.593 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:03.593 [2024-07-15 07:46:48.343155] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:03.593 [2024-07-15 07:46:48.344171] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:03.593 [2024-07-15 07:46:48.344309] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x173b360 00:11:03.593 [2024-07-15 07:46:48.344317] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:03.593 [2024-07-15 07:46:48.344463] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17388a0 00:11:03.593 [2024-07-15 07:46:48.344577] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x173b360 00:11:03.593 [2024-07-15 07:46:48.344583] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x173b360 00:11:03.593 [2024-07-15 07:46:48.344657] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.869 "name": "raid_bdev1", 00:11:03.869 "uuid": "73ed1b41-f617-4cea-b7a8-8d7ddaff86fb", 00:11:03.869 "strip_size_kb": 64, 00:11:03.869 "state": "online", 00:11:03.869 
"raid_level": "raid0", 00:11:03.869 "superblock": true, 00:11:03.869 "num_base_bdevs": 2, 00:11:03.869 "num_base_bdevs_discovered": 2, 00:11:03.869 "num_base_bdevs_operational": 2, 00:11:03.869 "base_bdevs_list": [ 00:11:03.869 { 00:11:03.869 "name": "BaseBdev1", 00:11:03.869 "uuid": "028e0b4d-ff1a-55e7-991a-6db3890a9631", 00:11:03.869 "is_configured": true, 00:11:03.869 "data_offset": 2048, 00:11:03.869 "data_size": 63488 00:11:03.869 }, 00:11:03.869 { 00:11:03.869 "name": "BaseBdev2", 00:11:03.869 "uuid": "249fcd6a-b790-59fd-8cea-8831cc159d16", 00:11:03.869 "is_configured": true, 00:11:03.869 "data_offset": 2048, 00:11:03.869 "data_size": 63488 00:11:03.869 } 00:11:03.869 ] 00:11:03.869 }' 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.869 07:46:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.443 07:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:04.443 07:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:04.443 [2024-07-15 07:46:49.193509] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18d3270 00:11:05.383 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:05.643 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.644 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:05.904 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.904 "name": "raid_bdev1", 00:11:05.904 "uuid": "73ed1b41-f617-4cea-b7a8-8d7ddaff86fb", 00:11:05.904 "strip_size_kb": 64, 
00:11:05.904 "state": "online", 00:11:05.904 "raid_level": "raid0", 00:11:05.904 "superblock": true, 00:11:05.904 "num_base_bdevs": 2, 00:11:05.904 "num_base_bdevs_discovered": 2, 00:11:05.904 "num_base_bdevs_operational": 2, 00:11:05.904 "base_bdevs_list": [ 00:11:05.904 { 00:11:05.904 "name": "BaseBdev1", 00:11:05.904 "uuid": "028e0b4d-ff1a-55e7-991a-6db3890a9631", 00:11:05.904 "is_configured": true, 00:11:05.904 "data_offset": 2048, 00:11:05.904 "data_size": 63488 00:11:05.904 }, 00:11:05.904 { 00:11:05.904 "name": "BaseBdev2", 00:11:05.904 "uuid": "249fcd6a-b790-59fd-8cea-8831cc159d16", 00:11:05.904 "is_configured": true, 00:11:05.904 "data_offset": 2048, 00:11:05.904 "data_size": 63488 00:11:05.904 } 00:11:05.904 ] 00:11:05.904 }' 00:11:05.904 07:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.904 07:46:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.474 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:06.474 [2024-07-15 07:46:51.208956] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:06.474 [2024-07-15 07:46:51.208989] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:06.474 [2024-07-15 07:46:51.211630] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:06.474 [2024-07-15 07:46:51.211653] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:06.474 [2024-07-15 07:46:51.211672] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:06.474 [2024-07-15 07:46:51.211678] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x173b360 name raid_bdev1, state offline 00:11:06.474 0 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1591967 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1591967 ']' 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1591967 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1591967 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1591967' 00:11:06.734 killing process with pid 1591967 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1591967 00:11:06.734 [2024-07-15 07:46:51.296870] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1591967 00:11:06.734 [2024-07-15 07:46:51.302482] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ihYu2YHWfG 
00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:06.734 00:11:06.734 real 0m5.300s 00:11:06.734 user 0m8.331s 00:11:06.734 sys 0m0.737s 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:06.734 07:46:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.734 ************************************ 00:11:06.734 END TEST raid_write_error_test 00:11:06.734 ************************************ 00:11:06.734 07:46:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:06.734 07:46:51 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:06.734 07:46:51 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:06.734 07:46:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:06.734 07:46:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:06.734 07:46:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:06.994 ************************************ 00:11:06.994 START TEST raid_state_function_test 00:11:06.994 ************************************ 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1592949 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1592949' 00:11:06.994 Process raid pid: 1592949 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1592949 /var/tmp/spdk-raid.sock 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1592949 ']' 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:06.994 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:06.994 07:46:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.994 [2024-07-15 07:46:51.570897] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
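In outline, the raid_state_function_test start-up traced here is: launch a bare bdev_svc application on a private RPC socket with raid debug logging enabled, wait for it to listen, then drive every subsequent step through rpc.py against that socket. A minimal sketch using only the flags visible in the xtrace (paths shortened; waitforlisten is the autotest_common.sh helper invoked above; backgrounding with & and capturing $! is the natural reading of the raid_pid assignment at bdev_raid.sh@244):

  # bdev_raid.sh@243-246: start the RPC target for this test and wait for its socket
  test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  waitforlisten $raid_pid /var/tmp/spdk-raid.sock
  # from here on, every call takes the form:
  #   rpc.py -s /var/tmp/spdk-raid.sock <method> [args]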
00:11:06.994 [2024-07-15 07:46:51.570948] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:06.994 [2024-07-15 07:46:51.663374] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:06.994 [2024-07-15 07:46:51.741401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:07.254 [2024-07-15 07:46:51.786874] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:07.254 [2024-07-15 07:46:51.786896] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:07.823 07:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:07.823 07:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:07.823 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:08.084 [2024-07-15 07:46:52.587009] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:08.084 [2024-07-15 07:46:52.587041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:08.084 [2024-07-15 07:46:52.587047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:08.084 [2024-07-15 07:46:52.587053] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:08.084 "name": "Existed_Raid", 00:11:08.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:08.084 "strip_size_kb": 64, 00:11:08.084 "state": "configuring", 00:11:08.084 "raid_level": "concat", 00:11:08.084 "superblock": false, 
00:11:08.084 "num_base_bdevs": 2, 00:11:08.084 "num_base_bdevs_discovered": 0, 00:11:08.084 "num_base_bdevs_operational": 2, 00:11:08.084 "base_bdevs_list": [ 00:11:08.084 { 00:11:08.084 "name": "BaseBdev1", 00:11:08.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:08.084 "is_configured": false, 00:11:08.084 "data_offset": 0, 00:11:08.084 "data_size": 0 00:11:08.084 }, 00:11:08.084 { 00:11:08.084 "name": "BaseBdev2", 00:11:08.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:08.084 "is_configured": false, 00:11:08.084 "data_offset": 0, 00:11:08.084 "data_size": 0 00:11:08.084 } 00:11:08.084 ] 00:11:08.084 }' 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:08.084 07:46:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.655 07:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:08.914 [2024-07-15 07:46:53.529292] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:08.914 [2024-07-15 07:46:53.529315] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25586b0 name Existed_Raid, state configuring 00:11:08.914 07:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:09.173 [2024-07-15 07:46:53.721802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:09.173 [2024-07-15 07:46:53.721822] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:09.173 [2024-07-15 07:46:53.721828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:09.173 [2024-07-15 07:46:53.721834] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:09.173 07:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:09.173 [2024-07-15 07:46:53.924820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:09.173 BaseBdev1 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:09.433 07:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:09.433 07:46:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:11:09.693 [ 00:11:09.693 { 00:11:09.693 "name": "BaseBdev1", 00:11:09.693 "aliases": [ 00:11:09.693 "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5" 00:11:09.693 ], 00:11:09.693 "product_name": "Malloc disk", 00:11:09.693 "block_size": 512, 00:11:09.693 "num_blocks": 65536, 00:11:09.693 "uuid": "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5", 00:11:09.693 "assigned_rate_limits": { 00:11:09.693 "rw_ios_per_sec": 0, 00:11:09.693 "rw_mbytes_per_sec": 0, 00:11:09.693 "r_mbytes_per_sec": 0, 00:11:09.693 "w_mbytes_per_sec": 0 00:11:09.693 }, 00:11:09.693 "claimed": true, 00:11:09.693 "claim_type": "exclusive_write", 00:11:09.693 "zoned": false, 00:11:09.693 "supported_io_types": { 00:11:09.693 "read": true, 00:11:09.693 "write": true, 00:11:09.693 "unmap": true, 00:11:09.693 "flush": true, 00:11:09.693 "reset": true, 00:11:09.693 "nvme_admin": false, 00:11:09.693 "nvme_io": false, 00:11:09.693 "nvme_io_md": false, 00:11:09.693 "write_zeroes": true, 00:11:09.693 "zcopy": true, 00:11:09.693 "get_zone_info": false, 00:11:09.693 "zone_management": false, 00:11:09.693 "zone_append": false, 00:11:09.693 "compare": false, 00:11:09.693 "compare_and_write": false, 00:11:09.693 "abort": true, 00:11:09.693 "seek_hole": false, 00:11:09.693 "seek_data": false, 00:11:09.693 "copy": true, 00:11:09.693 "nvme_iov_md": false 00:11:09.693 }, 00:11:09.693 "memory_domains": [ 00:11:09.693 { 00:11:09.693 "dma_device_id": "system", 00:11:09.693 "dma_device_type": 1 00:11:09.693 }, 00:11:09.693 { 00:11:09.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.693 "dma_device_type": 2 00:11:09.693 } 00:11:09.693 ], 00:11:09.693 "driver_specific": {} 00:11:09.693 } 00:11:09.693 ] 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:09.693 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.953 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.953 "name": "Existed_Raid", 00:11:09.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.953 "strip_size_kb": 64, 00:11:09.953 "state": "configuring", 00:11:09.953 
"raid_level": "concat", 00:11:09.953 "superblock": false, 00:11:09.953 "num_base_bdevs": 2, 00:11:09.953 "num_base_bdevs_discovered": 1, 00:11:09.953 "num_base_bdevs_operational": 2, 00:11:09.953 "base_bdevs_list": [ 00:11:09.953 { 00:11:09.953 "name": "BaseBdev1", 00:11:09.953 "uuid": "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5", 00:11:09.953 "is_configured": true, 00:11:09.953 "data_offset": 0, 00:11:09.953 "data_size": 65536 00:11:09.953 }, 00:11:09.953 { 00:11:09.953 "name": "BaseBdev2", 00:11:09.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.953 "is_configured": false, 00:11:09.953 "data_offset": 0, 00:11:09.953 "data_size": 0 00:11:09.953 } 00:11:09.953 ] 00:11:09.953 }' 00:11:09.953 07:46:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.953 07:46:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.524 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:10.524 [2024-07-15 07:46:55.232123] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:10.524 [2024-07-15 07:46:55.232154] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2557fa0 name Existed_Raid, state configuring 00:11:10.524 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:10.785 [2024-07-15 07:46:55.392555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:10.785 [2024-07-15 07:46:55.393692] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:10.785 [2024-07-15 07:46:55.393727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.785 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:11.046 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:11.046 "name": "Existed_Raid", 00:11:11.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.046 "strip_size_kb": 64, 00:11:11.046 "state": "configuring", 00:11:11.046 "raid_level": "concat", 00:11:11.046 "superblock": false, 00:11:11.046 "num_base_bdevs": 2, 00:11:11.046 "num_base_bdevs_discovered": 1, 00:11:11.046 "num_base_bdevs_operational": 2, 00:11:11.046 "base_bdevs_list": [ 00:11:11.046 { 00:11:11.046 "name": "BaseBdev1", 00:11:11.046 "uuid": "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5", 00:11:11.046 "is_configured": true, 00:11:11.046 "data_offset": 0, 00:11:11.046 "data_size": 65536 00:11:11.046 }, 00:11:11.046 { 00:11:11.046 "name": "BaseBdev2", 00:11:11.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.046 "is_configured": false, 00:11:11.046 "data_offset": 0, 00:11:11.046 "data_size": 0 00:11:11.046 } 00:11:11.046 ] 00:11:11.046 }' 00:11:11.046 07:46:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:11.046 07:46:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:11.617 [2024-07-15 07:46:56.335785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:11.617 [2024-07-15 07:46:56.335808] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2558d90 00:11:11.617 [2024-07-15 07:46:56.335813] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:11.617 [2024-07-15 07:46:56.335959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26fc730 00:11:11.617 [2024-07-15 07:46:56.336048] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2558d90 00:11:11.617 [2024-07-15 07:46:56.336054] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2558d90 00:11:11.617 [2024-07-15 07:46:56.336176] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:11.617 BaseBdev2 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:11.617 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:11.878 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:12.137 [ 00:11:12.137 { 00:11:12.137 "name": "BaseBdev2", 00:11:12.137 "aliases": [ 00:11:12.137 "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8" 00:11:12.137 ], 00:11:12.137 "product_name": "Malloc disk", 00:11:12.137 "block_size": 512, 00:11:12.137 "num_blocks": 65536, 00:11:12.137 "uuid": "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8", 00:11:12.137 "assigned_rate_limits": { 00:11:12.137 "rw_ios_per_sec": 0, 00:11:12.137 "rw_mbytes_per_sec": 0, 00:11:12.137 "r_mbytes_per_sec": 0, 00:11:12.137 "w_mbytes_per_sec": 0 00:11:12.137 }, 00:11:12.137 "claimed": true, 00:11:12.137 "claim_type": "exclusive_write", 00:11:12.137 "zoned": false, 00:11:12.137 "supported_io_types": { 00:11:12.137 "read": true, 00:11:12.137 "write": true, 00:11:12.137 "unmap": true, 00:11:12.137 "flush": true, 00:11:12.138 "reset": true, 00:11:12.138 "nvme_admin": false, 00:11:12.138 "nvme_io": false, 00:11:12.138 "nvme_io_md": false, 00:11:12.138 "write_zeroes": true, 00:11:12.138 "zcopy": true, 00:11:12.138 "get_zone_info": false, 00:11:12.138 "zone_management": false, 00:11:12.138 "zone_append": false, 00:11:12.138 "compare": false, 00:11:12.138 "compare_and_write": false, 00:11:12.138 "abort": true, 00:11:12.138 "seek_hole": false, 00:11:12.138 "seek_data": false, 00:11:12.138 "copy": true, 00:11:12.138 "nvme_iov_md": false 00:11:12.138 }, 00:11:12.138 "memory_domains": [ 00:11:12.138 { 00:11:12.138 "dma_device_id": "system", 00:11:12.138 "dma_device_type": 1 00:11:12.138 }, 00:11:12.138 { 00:11:12.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.138 "dma_device_type": 2 00:11:12.138 } 00:11:12.138 ], 00:11:12.138 "driver_specific": {} 00:11:12.138 } 00:11:12.138 ] 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.138 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:12.399 07:46:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.399 "name": "Existed_Raid", 00:11:12.399 "uuid": "4a5548f3-7b6f-40d5-9557-199e900bfeee", 00:11:12.399 "strip_size_kb": 64, 00:11:12.399 "state": "online", 00:11:12.399 "raid_level": "concat", 00:11:12.399 "superblock": false, 00:11:12.399 "num_base_bdevs": 2, 00:11:12.399 "num_base_bdevs_discovered": 2, 00:11:12.399 "num_base_bdevs_operational": 2, 00:11:12.399 "base_bdevs_list": [ 00:11:12.399 { 00:11:12.399 "name": "BaseBdev1", 00:11:12.399 "uuid": "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5", 00:11:12.399 "is_configured": true, 00:11:12.399 "data_offset": 0, 00:11:12.399 "data_size": 65536 00:11:12.399 }, 00:11:12.399 { 00:11:12.399 "name": "BaseBdev2", 00:11:12.399 "uuid": "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8", 00:11:12.399 "is_configured": true, 00:11:12.399 "data_offset": 0, 00:11:12.399 "data_size": 65536 00:11:12.399 } 00:11:12.399 ] 00:11:12.399 }' 00:11:12.399 07:46:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.399 07:46:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:12.971 [2024-07-15 07:46:57.655335] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:12.971 "name": "Existed_Raid", 00:11:12.971 "aliases": [ 00:11:12.971 "4a5548f3-7b6f-40d5-9557-199e900bfeee" 00:11:12.971 ], 00:11:12.971 "product_name": "Raid Volume", 00:11:12.971 "block_size": 512, 00:11:12.971 "num_blocks": 131072, 00:11:12.971 "uuid": "4a5548f3-7b6f-40d5-9557-199e900bfeee", 00:11:12.971 "assigned_rate_limits": { 00:11:12.971 "rw_ios_per_sec": 0, 00:11:12.971 "rw_mbytes_per_sec": 0, 00:11:12.971 "r_mbytes_per_sec": 0, 00:11:12.971 "w_mbytes_per_sec": 0 00:11:12.971 }, 00:11:12.971 "claimed": false, 00:11:12.971 "zoned": false, 00:11:12.971 "supported_io_types": { 00:11:12.971 "read": true, 00:11:12.971 "write": true, 00:11:12.971 "unmap": true, 00:11:12.971 "flush": true, 00:11:12.971 "reset": true, 00:11:12.971 "nvme_admin": false, 00:11:12.971 "nvme_io": false, 00:11:12.971 "nvme_io_md": false, 00:11:12.971 "write_zeroes": true, 00:11:12.971 "zcopy": false, 00:11:12.971 "get_zone_info": false, 00:11:12.971 "zone_management": false, 00:11:12.971 "zone_append": false, 00:11:12.971 "compare": false, 00:11:12.971 "compare_and_write": false, 00:11:12.971 "abort": false, 00:11:12.971 "seek_hole": false, 00:11:12.971 "seek_data": false, 00:11:12.971 "copy": false, 00:11:12.971 
"nvme_iov_md": false 00:11:12.971 }, 00:11:12.971 "memory_domains": [ 00:11:12.971 { 00:11:12.971 "dma_device_id": "system", 00:11:12.971 "dma_device_type": 1 00:11:12.971 }, 00:11:12.971 { 00:11:12.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.971 "dma_device_type": 2 00:11:12.971 }, 00:11:12.971 { 00:11:12.971 "dma_device_id": "system", 00:11:12.971 "dma_device_type": 1 00:11:12.971 }, 00:11:12.971 { 00:11:12.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.971 "dma_device_type": 2 00:11:12.971 } 00:11:12.971 ], 00:11:12.971 "driver_specific": { 00:11:12.971 "raid": { 00:11:12.971 "uuid": "4a5548f3-7b6f-40d5-9557-199e900bfeee", 00:11:12.971 "strip_size_kb": 64, 00:11:12.971 "state": "online", 00:11:12.971 "raid_level": "concat", 00:11:12.971 "superblock": false, 00:11:12.971 "num_base_bdevs": 2, 00:11:12.971 "num_base_bdevs_discovered": 2, 00:11:12.971 "num_base_bdevs_operational": 2, 00:11:12.971 "base_bdevs_list": [ 00:11:12.971 { 00:11:12.971 "name": "BaseBdev1", 00:11:12.971 "uuid": "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5", 00:11:12.971 "is_configured": true, 00:11:12.971 "data_offset": 0, 00:11:12.971 "data_size": 65536 00:11:12.971 }, 00:11:12.971 { 00:11:12.971 "name": "BaseBdev2", 00:11:12.971 "uuid": "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8", 00:11:12.971 "is_configured": true, 00:11:12.971 "data_offset": 0, 00:11:12.971 "data_size": 65536 00:11:12.971 } 00:11:12.971 ] 00:11:12.971 } 00:11:12.971 } 00:11:12.971 }' 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:12.971 BaseBdev2' 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:12.971 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:13.232 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:13.232 "name": "BaseBdev1", 00:11:13.232 "aliases": [ 00:11:13.232 "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5" 00:11:13.232 ], 00:11:13.232 "product_name": "Malloc disk", 00:11:13.232 "block_size": 512, 00:11:13.232 "num_blocks": 65536, 00:11:13.232 "uuid": "fe3ff7a3-265b-4d93-ab53-f5d69d7d7fe5", 00:11:13.232 "assigned_rate_limits": { 00:11:13.232 "rw_ios_per_sec": 0, 00:11:13.232 "rw_mbytes_per_sec": 0, 00:11:13.232 "r_mbytes_per_sec": 0, 00:11:13.232 "w_mbytes_per_sec": 0 00:11:13.232 }, 00:11:13.232 "claimed": true, 00:11:13.232 "claim_type": "exclusive_write", 00:11:13.232 "zoned": false, 00:11:13.232 "supported_io_types": { 00:11:13.232 "read": true, 00:11:13.232 "write": true, 00:11:13.232 "unmap": true, 00:11:13.232 "flush": true, 00:11:13.232 "reset": true, 00:11:13.232 "nvme_admin": false, 00:11:13.232 "nvme_io": false, 00:11:13.232 "nvme_io_md": false, 00:11:13.232 "write_zeroes": true, 00:11:13.232 "zcopy": true, 00:11:13.232 "get_zone_info": false, 00:11:13.232 "zone_management": false, 00:11:13.232 "zone_append": false, 00:11:13.232 "compare": false, 00:11:13.232 "compare_and_write": false, 00:11:13.232 "abort": true, 00:11:13.232 "seek_hole": false, 00:11:13.232 "seek_data": false, 00:11:13.232 "copy": true, 00:11:13.232 
"nvme_iov_md": false 00:11:13.232 }, 00:11:13.232 "memory_domains": [ 00:11:13.232 { 00:11:13.232 "dma_device_id": "system", 00:11:13.232 "dma_device_type": 1 00:11:13.232 }, 00:11:13.232 { 00:11:13.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.232 "dma_device_type": 2 00:11:13.232 } 00:11:13.232 ], 00:11:13.232 "driver_specific": {} 00:11:13.232 }' 00:11:13.232 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.232 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.492 07:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:13.492 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:13.752 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:13.752 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:13.752 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:13.752 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:13.752 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:13.752 "name": "BaseBdev2", 00:11:13.752 "aliases": [ 00:11:13.752 "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8" 00:11:13.752 ], 00:11:13.752 "product_name": "Malloc disk", 00:11:13.752 "block_size": 512, 00:11:13.752 "num_blocks": 65536, 00:11:13.752 "uuid": "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8", 00:11:13.752 "assigned_rate_limits": { 00:11:13.752 "rw_ios_per_sec": 0, 00:11:13.752 "rw_mbytes_per_sec": 0, 00:11:13.752 "r_mbytes_per_sec": 0, 00:11:13.752 "w_mbytes_per_sec": 0 00:11:13.752 }, 00:11:13.752 "claimed": true, 00:11:13.752 "claim_type": "exclusive_write", 00:11:13.752 "zoned": false, 00:11:13.752 "supported_io_types": { 00:11:13.752 "read": true, 00:11:13.752 "write": true, 00:11:13.752 "unmap": true, 00:11:13.752 "flush": true, 00:11:13.752 "reset": true, 00:11:13.752 "nvme_admin": false, 00:11:13.752 "nvme_io": false, 00:11:13.752 "nvme_io_md": false, 00:11:13.752 "write_zeroes": true, 00:11:13.752 "zcopy": true, 00:11:13.752 "get_zone_info": false, 00:11:13.752 "zone_management": false, 00:11:13.752 "zone_append": false, 00:11:13.752 "compare": false, 00:11:13.752 "compare_and_write": false, 00:11:13.752 "abort": true, 00:11:13.752 "seek_hole": false, 00:11:13.753 "seek_data": false, 00:11:13.753 "copy": true, 00:11:13.753 "nvme_iov_md": false 00:11:13.753 }, 00:11:13.753 "memory_domains": [ 00:11:13.753 { 00:11:13.753 "dma_device_id": "system", 00:11:13.753 "dma_device_type": 1 00:11:13.753 }, 
00:11:13.753 { 00:11:13.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.753 "dma_device_type": 2 00:11:13.753 } 00:11:13.753 ], 00:11:13.753 "driver_specific": {} 00:11:13.753 }' 00:11:13.753 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:13.753 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.013 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:14.274 [2024-07-15 07:46:58.970489] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:14.274 [2024-07-15 07:46:58.970507] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:14.274 [2024-07-15 07:46:58.970537] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.274 07:46:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:14.534 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:14.534 "name": "Existed_Raid", 00:11:14.534 "uuid": "4a5548f3-7b6f-40d5-9557-199e900bfeee", 00:11:14.534 "strip_size_kb": 64, 00:11:14.534 "state": "offline", 00:11:14.534 "raid_level": "concat", 00:11:14.534 "superblock": false, 00:11:14.534 "num_base_bdevs": 2, 00:11:14.534 "num_base_bdevs_discovered": 1, 00:11:14.534 "num_base_bdevs_operational": 1, 00:11:14.534 "base_bdevs_list": [ 00:11:14.534 { 00:11:14.534 "name": null, 00:11:14.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:14.534 "is_configured": false, 00:11:14.534 "data_offset": 0, 00:11:14.534 "data_size": 65536 00:11:14.534 }, 00:11:14.534 { 00:11:14.534 "name": "BaseBdev2", 00:11:14.534 "uuid": "728f0ca2-cb45-46b3-bcb7-6d2f83edb2d8", 00:11:14.534 "is_configured": true, 00:11:14.534 "data_offset": 0, 00:11:14.534 "data_size": 65536 00:11:14.534 } 00:11:14.534 ] 00:11:14.534 }' 00:11:14.534 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:14.534 07:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.105 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:15.105 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:15.105 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.105 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:15.365 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:15.365 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:15.365 07:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:15.365 [2024-07-15 07:47:00.089390] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:15.365 [2024-07-15 07:47:00.089428] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2558d90 name Existed_Raid, state offline 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1592949 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1592949 ']' 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1592949 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1592949 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1592949' 00:11:15.625 killing process with pid 1592949 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1592949 00:11:15.625 [2024-07-15 07:47:00.373132] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:15.625 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1592949 00:11:15.625 [2024-07-15 07:47:00.373720] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:15.887 00:11:15.887 real 0m8.989s 00:11:15.887 user 0m16.308s 00:11:15.887 sys 0m1.371s 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.887 ************************************ 00:11:15.887 END TEST raid_state_function_test 00:11:15.887 ************************************ 00:11:15.887 07:47:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:15.887 07:47:00 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:15.887 07:47:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:15.887 07:47:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:15.887 07:47:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:15.887 ************************************ 00:11:15.887 START TEST raid_state_function_test_sb 00:11:15.887 ************************************ 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:15.887 07:47:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1594695 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1594695' 00:11:15.887 Process raid pid: 1594695 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1594695 /var/tmp/spdk-raid.sock 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1594695 ']' 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:15.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
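The raid_state_function_test_sb run starting here repeats the walk the previous test just finished, with superblock=true so that superblock_create_arg becomes -s. Stripped to the RPC calls that appear in both traces, the pattern is: create the raid while its base bdevs are still missing, confirm it sits in "configuring", add the malloc base bdevs one at a time until it goes "online", then delete a base bdev and expect "offline", since concat (like raid0) has no redundancy. A condensed sketch in the -s form used by this variant (paths shortened; 32 and 512 are the malloc size in MiB and block size, matching the 65536 x 512-byte blocks reported for BaseBdev1/BaseBdev2 above):

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  #  -> "state": "configuring", "num_base_bdevs_discovered": 0
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
  rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
  #  -> "state": "online", "num_base_bdevs_discovered": 2
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
  #  -> "state": "offline", "num_base_bdevs_operational": 1

One visible difference with -s: the base bdevs report data_offset 2048 and data_size 63488 rather than the 0/65536 seen in the run above, consistent with space reserved at the front of each base bdev for the on-disk superblock.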
00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:15.887 07:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:15.887 [2024-07-15 07:47:00.630659] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:11:15.887 [2024-07-15 07:47:00.630707] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:16.149 [2024-07-15 07:47:00.719498] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:16.149 [2024-07-15 07:47:00.796564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:16.149 [2024-07-15 07:47:00.844039] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:16.149 [2024-07-15 07:47:00.844064] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:17.090 [2024-07-15 07:47:01.648142] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:17.090 [2024-07-15 07:47:01.648172] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:17.090 [2024-07-15 07:47:01.648178] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:17.090 [2024-07-15 07:47:01.648184] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.090 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:17.351 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.351 "name": "Existed_Raid", 00:11:17.351 "uuid": "e758ab8c-ccea-46d2-8091-2250ffb04234", 00:11:17.351 "strip_size_kb": 64, 00:11:17.351 "state": "configuring", 00:11:17.351 "raid_level": "concat", 00:11:17.351 "superblock": true, 00:11:17.351 "num_base_bdevs": 2, 00:11:17.351 "num_base_bdevs_discovered": 0, 00:11:17.351 "num_base_bdevs_operational": 2, 00:11:17.351 "base_bdevs_list": [ 00:11:17.351 { 00:11:17.351 "name": "BaseBdev1", 00:11:17.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.351 "is_configured": false, 00:11:17.351 "data_offset": 0, 00:11:17.351 "data_size": 0 00:11:17.351 }, 00:11:17.351 { 00:11:17.351 "name": "BaseBdev2", 00:11:17.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.351 "is_configured": false, 00:11:17.351 "data_offset": 0, 00:11:17.351 "data_size": 0 00:11:17.351 } 00:11:17.351 ] 00:11:17.351 }' 00:11:17.351 07:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.351 07:47:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:17.919 07:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:17.919 [2024-07-15 07:47:02.606456] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:17.919 [2024-07-15 07:47:02.606475] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21436b0 name Existed_Raid, state configuring 00:11:17.919 07:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:18.179 [2024-07-15 07:47:02.782917] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:18.180 [2024-07-15 07:47:02.782933] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:18.180 [2024-07-15 07:47:02.782939] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:18.180 [2024-07-15 07:47:02.782944] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:18.180 07:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:18.440 [2024-07-15 07:47:02.982238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:18.440 BaseBdev1 00:11:18.440 07:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:18.440 07:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:18.440 07:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:18.440 07:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:18.440 07:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:18.440 07:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:18.440 
07:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:18.440 07:47:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:18.699 [ 00:11:18.699 { 00:11:18.699 "name": "BaseBdev1", 00:11:18.699 "aliases": [ 00:11:18.699 "0d17df92-769b-467e-97a2-81e8e132d8d2" 00:11:18.699 ], 00:11:18.699 "product_name": "Malloc disk", 00:11:18.700 "block_size": 512, 00:11:18.700 "num_blocks": 65536, 00:11:18.700 "uuid": "0d17df92-769b-467e-97a2-81e8e132d8d2", 00:11:18.700 "assigned_rate_limits": { 00:11:18.700 "rw_ios_per_sec": 0, 00:11:18.700 "rw_mbytes_per_sec": 0, 00:11:18.700 "r_mbytes_per_sec": 0, 00:11:18.700 "w_mbytes_per_sec": 0 00:11:18.700 }, 00:11:18.700 "claimed": true, 00:11:18.700 "claim_type": "exclusive_write", 00:11:18.700 "zoned": false, 00:11:18.700 "supported_io_types": { 00:11:18.700 "read": true, 00:11:18.700 "write": true, 00:11:18.700 "unmap": true, 00:11:18.700 "flush": true, 00:11:18.700 "reset": true, 00:11:18.700 "nvme_admin": false, 00:11:18.700 "nvme_io": false, 00:11:18.700 "nvme_io_md": false, 00:11:18.700 "write_zeroes": true, 00:11:18.700 "zcopy": true, 00:11:18.700 "get_zone_info": false, 00:11:18.700 "zone_management": false, 00:11:18.700 "zone_append": false, 00:11:18.700 "compare": false, 00:11:18.700 "compare_and_write": false, 00:11:18.700 "abort": true, 00:11:18.700 "seek_hole": false, 00:11:18.700 "seek_data": false, 00:11:18.700 "copy": true, 00:11:18.700 "nvme_iov_md": false 00:11:18.700 }, 00:11:18.700 "memory_domains": [ 00:11:18.700 { 00:11:18.700 "dma_device_id": "system", 00:11:18.700 "dma_device_type": 1 00:11:18.700 }, 00:11:18.700 { 00:11:18.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.700 "dma_device_type": 2 00:11:18.700 } 00:11:18.700 ], 00:11:18.700 "driver_specific": {} 00:11:18.700 } 00:11:18.700 ] 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.700 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.972 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.972 "name": "Existed_Raid", 00:11:18.972 "uuid": "cef26755-17ca-43e6-a7c6-c8d2b28b6b03", 00:11:18.972 "strip_size_kb": 64, 00:11:18.972 "state": "configuring", 00:11:18.972 "raid_level": "concat", 00:11:18.972 "superblock": true, 00:11:18.972 "num_base_bdevs": 2, 00:11:18.972 "num_base_bdevs_discovered": 1, 00:11:18.972 "num_base_bdevs_operational": 2, 00:11:18.972 "base_bdevs_list": [ 00:11:18.972 { 00:11:18.972 "name": "BaseBdev1", 00:11:18.972 "uuid": "0d17df92-769b-467e-97a2-81e8e132d8d2", 00:11:18.972 "is_configured": true, 00:11:18.972 "data_offset": 2048, 00:11:18.972 "data_size": 63488 00:11:18.972 }, 00:11:18.972 { 00:11:18.972 "name": "BaseBdev2", 00:11:18.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.972 "is_configured": false, 00:11:18.972 "data_offset": 0, 00:11:18.972 "data_size": 0 00:11:18.972 } 00:11:18.972 ] 00:11:18.972 }' 00:11:18.972 07:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.972 07:47:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.592 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:19.592 [2024-07-15 07:47:04.265469] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:19.592 [2024-07-15 07:47:04.265496] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2142fa0 name Existed_Raid, state configuring 00:11:19.593 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:19.854 [2024-07-15 07:47:04.462004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:19.854 [2024-07-15 07:47:04.463141] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:19.854 [2024-07-15 07:47:04.463165] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:19.854 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.115 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.115 "name": "Existed_Raid", 00:11:20.115 "uuid": "e8f79651-2918-44a1-8087-fe8ebb041230", 00:11:20.115 "strip_size_kb": 64, 00:11:20.115 "state": "configuring", 00:11:20.115 "raid_level": "concat", 00:11:20.115 "superblock": true, 00:11:20.115 "num_base_bdevs": 2, 00:11:20.115 "num_base_bdevs_discovered": 1, 00:11:20.115 "num_base_bdevs_operational": 2, 00:11:20.115 "base_bdevs_list": [ 00:11:20.115 { 00:11:20.115 "name": "BaseBdev1", 00:11:20.115 "uuid": "0d17df92-769b-467e-97a2-81e8e132d8d2", 00:11:20.115 "is_configured": true, 00:11:20.115 "data_offset": 2048, 00:11:20.115 "data_size": 63488 00:11:20.115 }, 00:11:20.115 { 00:11:20.115 "name": "BaseBdev2", 00:11:20.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.115 "is_configured": false, 00:11:20.115 "data_offset": 0, 00:11:20.115 "data_size": 0 00:11:20.115 } 00:11:20.115 ] 00:11:20.115 }' 00:11:20.115 07:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.115 07:47:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:20.686 [2024-07-15 07:47:05.409227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:20.686 [2024-07-15 07:47:05.409335] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2143d90 00:11:20.686 [2024-07-15 07:47:05.409343] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:20.686 [2024-07-15 07:47:05.409483] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22f7780 00:11:20.686 [2024-07-15 07:47:05.409570] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2143d90 00:11:20.686 [2024-07-15 07:47:05.409576] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2143d90 00:11:20.686 [2024-07-15 07:47:05.409641] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:20.686 BaseBdev2 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:20.686 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:20.947 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:21.208 [ 00:11:21.208 { 00:11:21.208 "name": "BaseBdev2", 00:11:21.208 "aliases": [ 00:11:21.208 "55a5adf2-5bf5-4a44-812b-d660ae2de747" 00:11:21.208 ], 00:11:21.208 "product_name": "Malloc disk", 00:11:21.208 "block_size": 512, 00:11:21.208 "num_blocks": 65536, 00:11:21.208 "uuid": "55a5adf2-5bf5-4a44-812b-d660ae2de747", 00:11:21.208 "assigned_rate_limits": { 00:11:21.208 "rw_ios_per_sec": 0, 00:11:21.208 "rw_mbytes_per_sec": 0, 00:11:21.208 "r_mbytes_per_sec": 0, 00:11:21.208 "w_mbytes_per_sec": 0 00:11:21.208 }, 00:11:21.208 "claimed": true, 00:11:21.208 "claim_type": "exclusive_write", 00:11:21.208 "zoned": false, 00:11:21.208 "supported_io_types": { 00:11:21.208 "read": true, 00:11:21.208 "write": true, 00:11:21.208 "unmap": true, 00:11:21.208 "flush": true, 00:11:21.208 "reset": true, 00:11:21.208 "nvme_admin": false, 00:11:21.208 "nvme_io": false, 00:11:21.208 "nvme_io_md": false, 00:11:21.208 "write_zeroes": true, 00:11:21.208 "zcopy": true, 00:11:21.208 "get_zone_info": false, 00:11:21.208 "zone_management": false, 00:11:21.208 "zone_append": false, 00:11:21.208 "compare": false, 00:11:21.208 "compare_and_write": false, 00:11:21.208 "abort": true, 00:11:21.208 "seek_hole": false, 00:11:21.208 "seek_data": false, 00:11:21.208 "copy": true, 00:11:21.208 "nvme_iov_md": false 00:11:21.208 }, 00:11:21.208 "memory_domains": [ 00:11:21.208 { 00:11:21.208 "dma_device_id": "system", 00:11:21.208 "dma_device_type": 1 00:11:21.208 }, 00:11:21.208 { 00:11:21.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.208 "dma_device_type": 2 00:11:21.208 } 00:11:21.208 ], 00:11:21.208 "driver_specific": {} 00:11:21.208 } 00:11:21.208 ] 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:21.208 
07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.208 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:21.468 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:21.468 "name": "Existed_Raid", 00:11:21.468 "uuid": "e8f79651-2918-44a1-8087-fe8ebb041230", 00:11:21.468 "strip_size_kb": 64, 00:11:21.468 "state": "online", 00:11:21.468 "raid_level": "concat", 00:11:21.468 "superblock": true, 00:11:21.468 "num_base_bdevs": 2, 00:11:21.468 "num_base_bdevs_discovered": 2, 00:11:21.468 "num_base_bdevs_operational": 2, 00:11:21.468 "base_bdevs_list": [ 00:11:21.468 { 00:11:21.468 "name": "BaseBdev1", 00:11:21.468 "uuid": "0d17df92-769b-467e-97a2-81e8e132d8d2", 00:11:21.468 "is_configured": true, 00:11:21.468 "data_offset": 2048, 00:11:21.468 "data_size": 63488 00:11:21.468 }, 00:11:21.468 { 00:11:21.468 "name": "BaseBdev2", 00:11:21.468 "uuid": "55a5adf2-5bf5-4a44-812b-d660ae2de747", 00:11:21.468 "is_configured": true, 00:11:21.468 "data_offset": 2048, 00:11:21.468 "data_size": 63488 00:11:21.468 } 00:11:21.468 ] 00:11:21.468 }' 00:11:21.468 07:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:21.468 07:47:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:22.040 [2024-07-15 07:47:06.696720] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:22.040 "name": "Existed_Raid", 00:11:22.040 "aliases": [ 00:11:22.040 "e8f79651-2918-44a1-8087-fe8ebb041230" 00:11:22.040 ], 00:11:22.040 "product_name": "Raid Volume", 00:11:22.040 "block_size": 512, 00:11:22.040 "num_blocks": 126976, 00:11:22.040 "uuid": "e8f79651-2918-44a1-8087-fe8ebb041230", 00:11:22.040 "assigned_rate_limits": { 00:11:22.040 "rw_ios_per_sec": 0, 00:11:22.040 "rw_mbytes_per_sec": 0, 00:11:22.040 "r_mbytes_per_sec": 0, 00:11:22.040 "w_mbytes_per_sec": 0 00:11:22.040 }, 00:11:22.040 "claimed": false, 00:11:22.040 "zoned": false, 00:11:22.040 
"supported_io_types": { 00:11:22.040 "read": true, 00:11:22.040 "write": true, 00:11:22.040 "unmap": true, 00:11:22.040 "flush": true, 00:11:22.040 "reset": true, 00:11:22.040 "nvme_admin": false, 00:11:22.040 "nvme_io": false, 00:11:22.040 "nvme_io_md": false, 00:11:22.040 "write_zeroes": true, 00:11:22.040 "zcopy": false, 00:11:22.040 "get_zone_info": false, 00:11:22.040 "zone_management": false, 00:11:22.040 "zone_append": false, 00:11:22.040 "compare": false, 00:11:22.040 "compare_and_write": false, 00:11:22.040 "abort": false, 00:11:22.040 "seek_hole": false, 00:11:22.040 "seek_data": false, 00:11:22.040 "copy": false, 00:11:22.040 "nvme_iov_md": false 00:11:22.040 }, 00:11:22.040 "memory_domains": [ 00:11:22.040 { 00:11:22.040 "dma_device_id": "system", 00:11:22.040 "dma_device_type": 1 00:11:22.040 }, 00:11:22.040 { 00:11:22.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.040 "dma_device_type": 2 00:11:22.040 }, 00:11:22.040 { 00:11:22.040 "dma_device_id": "system", 00:11:22.040 "dma_device_type": 1 00:11:22.040 }, 00:11:22.040 { 00:11:22.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.040 "dma_device_type": 2 00:11:22.040 } 00:11:22.040 ], 00:11:22.040 "driver_specific": { 00:11:22.040 "raid": { 00:11:22.040 "uuid": "e8f79651-2918-44a1-8087-fe8ebb041230", 00:11:22.040 "strip_size_kb": 64, 00:11:22.040 "state": "online", 00:11:22.040 "raid_level": "concat", 00:11:22.040 "superblock": true, 00:11:22.040 "num_base_bdevs": 2, 00:11:22.040 "num_base_bdevs_discovered": 2, 00:11:22.040 "num_base_bdevs_operational": 2, 00:11:22.040 "base_bdevs_list": [ 00:11:22.040 { 00:11:22.040 "name": "BaseBdev1", 00:11:22.040 "uuid": "0d17df92-769b-467e-97a2-81e8e132d8d2", 00:11:22.040 "is_configured": true, 00:11:22.040 "data_offset": 2048, 00:11:22.040 "data_size": 63488 00:11:22.040 }, 00:11:22.040 { 00:11:22.040 "name": "BaseBdev2", 00:11:22.040 "uuid": "55a5adf2-5bf5-4a44-812b-d660ae2de747", 00:11:22.040 "is_configured": true, 00:11:22.040 "data_offset": 2048, 00:11:22.040 "data_size": 63488 00:11:22.040 } 00:11:22.040 ] 00:11:22.040 } 00:11:22.040 } 00:11:22.040 }' 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:22.040 BaseBdev2' 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:22.040 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:22.302 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:22.302 "name": "BaseBdev1", 00:11:22.302 "aliases": [ 00:11:22.302 "0d17df92-769b-467e-97a2-81e8e132d8d2" 00:11:22.302 ], 00:11:22.302 "product_name": "Malloc disk", 00:11:22.302 "block_size": 512, 00:11:22.302 "num_blocks": 65536, 00:11:22.302 "uuid": "0d17df92-769b-467e-97a2-81e8e132d8d2", 00:11:22.302 "assigned_rate_limits": { 00:11:22.302 "rw_ios_per_sec": 0, 00:11:22.302 "rw_mbytes_per_sec": 0, 00:11:22.302 "r_mbytes_per_sec": 0, 00:11:22.302 "w_mbytes_per_sec": 0 00:11:22.302 }, 00:11:22.302 "claimed": true, 00:11:22.302 "claim_type": "exclusive_write", 00:11:22.302 "zoned": 
false, 00:11:22.302 "supported_io_types": { 00:11:22.302 "read": true, 00:11:22.302 "write": true, 00:11:22.302 "unmap": true, 00:11:22.302 "flush": true, 00:11:22.302 "reset": true, 00:11:22.302 "nvme_admin": false, 00:11:22.302 "nvme_io": false, 00:11:22.302 "nvme_io_md": false, 00:11:22.302 "write_zeroes": true, 00:11:22.302 "zcopy": true, 00:11:22.302 "get_zone_info": false, 00:11:22.302 "zone_management": false, 00:11:22.302 "zone_append": false, 00:11:22.302 "compare": false, 00:11:22.302 "compare_and_write": false, 00:11:22.302 "abort": true, 00:11:22.302 "seek_hole": false, 00:11:22.302 "seek_data": false, 00:11:22.302 "copy": true, 00:11:22.302 "nvme_iov_md": false 00:11:22.302 }, 00:11:22.302 "memory_domains": [ 00:11:22.302 { 00:11:22.302 "dma_device_id": "system", 00:11:22.302 "dma_device_type": 1 00:11:22.302 }, 00:11:22.302 { 00:11:22.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.302 "dma_device_type": 2 00:11:22.302 } 00:11:22.302 ], 00:11:22.302 "driver_specific": {} 00:11:22.302 }' 00:11:22.302 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.302 07:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.302 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:22.302 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:22.563 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:22.824 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:22.824 "name": "BaseBdev2", 00:11:22.824 "aliases": [ 00:11:22.824 "55a5adf2-5bf5-4a44-812b-d660ae2de747" 00:11:22.824 ], 00:11:22.824 "product_name": "Malloc disk", 00:11:22.824 "block_size": 512, 00:11:22.824 "num_blocks": 65536, 00:11:22.824 "uuid": "55a5adf2-5bf5-4a44-812b-d660ae2de747", 00:11:22.824 "assigned_rate_limits": { 00:11:22.824 "rw_ios_per_sec": 0, 00:11:22.824 "rw_mbytes_per_sec": 0, 00:11:22.824 "r_mbytes_per_sec": 0, 00:11:22.824 "w_mbytes_per_sec": 0 00:11:22.824 }, 00:11:22.824 "claimed": true, 00:11:22.824 "claim_type": "exclusive_write", 00:11:22.824 "zoned": false, 00:11:22.824 "supported_io_types": { 00:11:22.824 "read": true, 00:11:22.824 "write": true, 00:11:22.824 "unmap": true, 
00:11:22.824 "flush": true, 00:11:22.824 "reset": true, 00:11:22.824 "nvme_admin": false, 00:11:22.824 "nvme_io": false, 00:11:22.824 "nvme_io_md": false, 00:11:22.824 "write_zeroes": true, 00:11:22.824 "zcopy": true, 00:11:22.824 "get_zone_info": false, 00:11:22.824 "zone_management": false, 00:11:22.824 "zone_append": false, 00:11:22.824 "compare": false, 00:11:22.824 "compare_and_write": false, 00:11:22.824 "abort": true, 00:11:22.824 "seek_hole": false, 00:11:22.824 "seek_data": false, 00:11:22.824 "copy": true, 00:11:22.824 "nvme_iov_md": false 00:11:22.824 }, 00:11:22.824 "memory_domains": [ 00:11:22.824 { 00:11:22.824 "dma_device_id": "system", 00:11:22.824 "dma_device_type": 1 00:11:22.824 }, 00:11:22.824 { 00:11:22.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.824 "dma_device_type": 2 00:11:22.824 } 00:11:22.824 ], 00:11:22.824 "driver_specific": {} 00:11:22.824 }' 00:11:22.824 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.824 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.824 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:22.824 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.084 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.349 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:23.349 07:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:23.349 [2024-07-15 07:47:08.019884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:23.349 [2024-07-15 07:47:08.019907] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:23.349 [2024-07-15 07:47:08.019937] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:23.349 
07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.349 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:23.611 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:23.611 "name": "Existed_Raid", 00:11:23.611 "uuid": "e8f79651-2918-44a1-8087-fe8ebb041230", 00:11:23.611 "strip_size_kb": 64, 00:11:23.611 "state": "offline", 00:11:23.611 "raid_level": "concat", 00:11:23.611 "superblock": true, 00:11:23.611 "num_base_bdevs": 2, 00:11:23.611 "num_base_bdevs_discovered": 1, 00:11:23.611 "num_base_bdevs_operational": 1, 00:11:23.611 "base_bdevs_list": [ 00:11:23.611 { 00:11:23.611 "name": null, 00:11:23.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.611 "is_configured": false, 00:11:23.611 "data_offset": 2048, 00:11:23.611 "data_size": 63488 00:11:23.611 }, 00:11:23.611 { 00:11:23.611 "name": "BaseBdev2", 00:11:23.611 "uuid": "55a5adf2-5bf5-4a44-812b-d660ae2de747", 00:11:23.611 "is_configured": true, 00:11:23.611 "data_offset": 2048, 00:11:23.611 "data_size": 63488 00:11:23.611 } 00:11:23.611 ] 00:11:23.611 }' 00:11:23.611 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:23.611 07:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.179 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:24.180 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:24.180 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.180 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:24.440 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:24.440 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:24.440 07:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:25.012 [2024-07-15 07:47:09.475564] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:25.012 [2024-07-15 
07:47:09.475599] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2143d90 name Existed_Raid, state offline 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1594695 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1594695 ']' 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1594695 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1594695 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:25.012 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1594695' 00:11:25.012 killing process with pid 1594695 00:11:25.013 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1594695 00:11:25.013 [2024-07-15 07:47:09.760890] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:25.013 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1594695 00:11:25.013 [2024-07-15 07:47:09.761510] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:25.275 07:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:25.275 00:11:25.275 real 0m9.316s 00:11:25.275 user 0m16.943s 00:11:25.275 sys 0m1.387s 00:11:25.275 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:25.275 07:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:25.275 ************************************ 00:11:25.275 END TEST raid_state_function_test_sb 00:11:25.275 ************************************ 00:11:25.275 07:47:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:25.275 07:47:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:25.275 07:47:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:25.275 07:47:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:25.275 07:47:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
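(Reference sketch, not part of the captured trace: the RPC sequence that raid_state_function_test_sb exercises above, reduced to the bare commands visible in the log. It assumes an SPDK test application is already listening on the /var/tmp/spdk-raid.sock RPC socket; SPDK_DIR is a hypothetical placeholder for an SPDK checkout, where the log uses the absolute Jenkins workspace path.)

#!/usr/bin/env bash
set -euo pipefail

# Helper: send an RPC to the app under test over the raid test socket.
rpc() { "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk-raid.sock "$@"; }

# Two 32 MiB malloc bdevs with 512-byte blocks serve as base devices
# (matches "num_blocks": 65536, "block_size": 512 in the dumps above).
rpc bdev_malloc_create 32 512 -b BaseBdev1
rpc bdev_malloc_create 32 512 -b BaseBdev2

# Assemble a two-disk concat array with a 64 KiB strip size and an
# on-disk superblock (-s), then wait for bdev examination to finish.
rpc bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
rpc bdev_wait_for_examine

# Same state check the test performs: the array should report "online".
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# Concat has no redundancy, so deleting one base bdev drives the array
# to the "offline" state verified at the end of the test.
rpc bdev_malloc_delete BaseBdev1
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

# The test then shuts the app down (killprocess above), which tears down
# the remaining bdevs; no further RPC cleanup appears in the log.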
00:11:25.275 ************************************ 00:11:25.275 START TEST raid_superblock_test 00:11:25.275 ************************************ 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1596611 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1596611 /var/tmp/spdk-raid.sock 00:11:25.275 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1596611 ']' 00:11:25.276 07:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:25.276 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:25.276 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:25.276 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:25.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:25.276 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:25.276 07:47:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:25.276 [2024-07-15 07:47:10.020194] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:11:25.276 [2024-07-15 07:47:10.020246] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1596611 ] 00:11:25.536 [2024-07-15 07:47:10.115805] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:25.536 [2024-07-15 07:47:10.193509] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.536 [2024-07-15 07:47:10.242060] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.536 [2024-07-15 07:47:10.242089] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:26.473 07:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:26.473 malloc1 00:11:26.473 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:26.733 [2024-07-15 07:47:11.253312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:26.733 [2024-07-15 07:47:11.253347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.733 [2024-07-15 07:47:11.253358] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fba20 00:11:26.733 [2024-07-15 07:47:11.253365] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.733 [2024-07-15 07:47:11.254662] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.733 [2024-07-15 07:47:11.254683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:26.733 pt1 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:26.733 malloc2 00:11:26.733 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:26.993 [2024-07-15 07:47:11.624273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:26.993 [2024-07-15 07:47:11.624300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.993 [2024-07-15 07:47:11.624311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fc040 00:11:26.993 [2024-07-15 07:47:11.624318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.993 [2024-07-15 07:47:11.625507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.993 [2024-07-15 07:47:11.625524] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:26.993 pt2 00:11:26.993 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:26.993 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:26.994 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:27.300 [2024-07-15 07:47:11.808757] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:27.300 [2024-07-15 07:47:11.809735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:27.300 [2024-07-15 07:47:11.809841] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a83d0 00:11:27.300 [2024-07-15 07:47:11.809849] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:27.300 [2024-07-15 07:47:11.809988] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a77f0 00:11:27.300 [2024-07-15 07:47:11.810091] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a83d0 00:11:27.300 [2024-07-15 07:47:11.810097] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18a83d0 00:11:27.300 [2024-07-15 07:47:11.810162] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:27.300 07:47:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.300 07:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:27.300 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.300 "name": "raid_bdev1", 00:11:27.300 "uuid": "ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:27.300 "strip_size_kb": 64, 00:11:27.300 "state": "online", 00:11:27.300 "raid_level": "concat", 00:11:27.300 "superblock": true, 00:11:27.300 "num_base_bdevs": 2, 00:11:27.300 "num_base_bdevs_discovered": 2, 00:11:27.300 "num_base_bdevs_operational": 2, 00:11:27.300 "base_bdevs_list": [ 00:11:27.300 { 00:11:27.300 "name": "pt1", 00:11:27.300 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:27.300 "is_configured": true, 00:11:27.300 "data_offset": 2048, 00:11:27.300 "data_size": 63488 00:11:27.300 }, 00:11:27.300 { 00:11:27.300 "name": "pt2", 00:11:27.300 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:27.300 "is_configured": true, 00:11:27.300 "data_offset": 2048, 00:11:27.300 "data_size": 63488 00:11:27.300 } 00:11:27.300 ] 00:11:27.300 }' 00:11:27.300 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.300 07:47:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:27.868 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:28.127 [2024-07-15 07:47:12.735265] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:28.127 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:28.127 "name": "raid_bdev1", 00:11:28.127 "aliases": [ 00:11:28.127 "ddbb871c-5d6b-4705-a1c6-d07252e29d9c" 00:11:28.127 ], 00:11:28.127 "product_name": "Raid Volume", 00:11:28.127 "block_size": 512, 00:11:28.127 "num_blocks": 126976, 00:11:28.127 "uuid": 
"ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:28.127 "assigned_rate_limits": { 00:11:28.127 "rw_ios_per_sec": 0, 00:11:28.127 "rw_mbytes_per_sec": 0, 00:11:28.127 "r_mbytes_per_sec": 0, 00:11:28.127 "w_mbytes_per_sec": 0 00:11:28.127 }, 00:11:28.127 "claimed": false, 00:11:28.127 "zoned": false, 00:11:28.127 "supported_io_types": { 00:11:28.127 "read": true, 00:11:28.127 "write": true, 00:11:28.127 "unmap": true, 00:11:28.127 "flush": true, 00:11:28.127 "reset": true, 00:11:28.127 "nvme_admin": false, 00:11:28.127 "nvme_io": false, 00:11:28.127 "nvme_io_md": false, 00:11:28.127 "write_zeroes": true, 00:11:28.127 "zcopy": false, 00:11:28.127 "get_zone_info": false, 00:11:28.127 "zone_management": false, 00:11:28.127 "zone_append": false, 00:11:28.127 "compare": false, 00:11:28.127 "compare_and_write": false, 00:11:28.127 "abort": false, 00:11:28.127 "seek_hole": false, 00:11:28.127 "seek_data": false, 00:11:28.127 "copy": false, 00:11:28.127 "nvme_iov_md": false 00:11:28.127 }, 00:11:28.127 "memory_domains": [ 00:11:28.127 { 00:11:28.127 "dma_device_id": "system", 00:11:28.127 "dma_device_type": 1 00:11:28.127 }, 00:11:28.127 { 00:11:28.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.127 "dma_device_type": 2 00:11:28.127 }, 00:11:28.127 { 00:11:28.127 "dma_device_id": "system", 00:11:28.127 "dma_device_type": 1 00:11:28.127 }, 00:11:28.127 { 00:11:28.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.127 "dma_device_type": 2 00:11:28.127 } 00:11:28.127 ], 00:11:28.127 "driver_specific": { 00:11:28.127 "raid": { 00:11:28.127 "uuid": "ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:28.127 "strip_size_kb": 64, 00:11:28.127 "state": "online", 00:11:28.127 "raid_level": "concat", 00:11:28.127 "superblock": true, 00:11:28.127 "num_base_bdevs": 2, 00:11:28.127 "num_base_bdevs_discovered": 2, 00:11:28.127 "num_base_bdevs_operational": 2, 00:11:28.127 "base_bdevs_list": [ 00:11:28.127 { 00:11:28.127 "name": "pt1", 00:11:28.127 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:28.127 "is_configured": true, 00:11:28.127 "data_offset": 2048, 00:11:28.127 "data_size": 63488 00:11:28.127 }, 00:11:28.127 { 00:11:28.127 "name": "pt2", 00:11:28.127 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:28.127 "is_configured": true, 00:11:28.127 "data_offset": 2048, 00:11:28.127 "data_size": 63488 00:11:28.127 } 00:11:28.127 ] 00:11:28.127 } 00:11:28.127 } 00:11:28.127 }' 00:11:28.127 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:28.127 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:28.127 pt2' 00:11:28.128 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.128 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.128 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:28.387 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.387 "name": "pt1", 00:11:28.387 "aliases": [ 00:11:28.387 "00000000-0000-0000-0000-000000000001" 00:11:28.387 ], 00:11:28.387 "product_name": "passthru", 00:11:28.387 "block_size": 512, 00:11:28.387 "num_blocks": 65536, 00:11:28.387 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:28.387 "assigned_rate_limits": { 00:11:28.387 
"rw_ios_per_sec": 0, 00:11:28.387 "rw_mbytes_per_sec": 0, 00:11:28.387 "r_mbytes_per_sec": 0, 00:11:28.387 "w_mbytes_per_sec": 0 00:11:28.387 }, 00:11:28.387 "claimed": true, 00:11:28.387 "claim_type": "exclusive_write", 00:11:28.387 "zoned": false, 00:11:28.387 "supported_io_types": { 00:11:28.387 "read": true, 00:11:28.387 "write": true, 00:11:28.387 "unmap": true, 00:11:28.387 "flush": true, 00:11:28.387 "reset": true, 00:11:28.387 "nvme_admin": false, 00:11:28.387 "nvme_io": false, 00:11:28.387 "nvme_io_md": false, 00:11:28.387 "write_zeroes": true, 00:11:28.387 "zcopy": true, 00:11:28.387 "get_zone_info": false, 00:11:28.387 "zone_management": false, 00:11:28.387 "zone_append": false, 00:11:28.387 "compare": false, 00:11:28.387 "compare_and_write": false, 00:11:28.387 "abort": true, 00:11:28.387 "seek_hole": false, 00:11:28.387 "seek_data": false, 00:11:28.387 "copy": true, 00:11:28.387 "nvme_iov_md": false 00:11:28.387 }, 00:11:28.387 "memory_domains": [ 00:11:28.387 { 00:11:28.387 "dma_device_id": "system", 00:11:28.387 "dma_device_type": 1 00:11:28.387 }, 00:11:28.387 { 00:11:28.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.387 "dma_device_type": 2 00:11:28.387 } 00:11:28.387 ], 00:11:28.387 "driver_specific": { 00:11:28.387 "passthru": { 00:11:28.387 "name": "pt1", 00:11:28.387 "base_bdev_name": "malloc1" 00:11:28.387 } 00:11:28.387 } 00:11:28.387 }' 00:11:28.387 07:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.387 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.387 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.387 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.387 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.647 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:28.906 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.906 "name": "pt2", 00:11:28.906 "aliases": [ 00:11:28.906 "00000000-0000-0000-0000-000000000002" 00:11:28.906 ], 00:11:28.906 "product_name": "passthru", 00:11:28.906 "block_size": 512, 00:11:28.906 "num_blocks": 65536, 00:11:28.906 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:28.906 "assigned_rate_limits": { 00:11:28.906 "rw_ios_per_sec": 0, 00:11:28.906 "rw_mbytes_per_sec": 0, 00:11:28.906 "r_mbytes_per_sec": 0, 00:11:28.906 "w_mbytes_per_sec": 0 
00:11:28.906 }, 00:11:28.906 "claimed": true, 00:11:28.906 "claim_type": "exclusive_write", 00:11:28.906 "zoned": false, 00:11:28.906 "supported_io_types": { 00:11:28.906 "read": true, 00:11:28.906 "write": true, 00:11:28.906 "unmap": true, 00:11:28.906 "flush": true, 00:11:28.906 "reset": true, 00:11:28.906 "nvme_admin": false, 00:11:28.906 "nvme_io": false, 00:11:28.906 "nvme_io_md": false, 00:11:28.906 "write_zeroes": true, 00:11:28.906 "zcopy": true, 00:11:28.906 "get_zone_info": false, 00:11:28.906 "zone_management": false, 00:11:28.906 "zone_append": false, 00:11:28.906 "compare": false, 00:11:28.906 "compare_and_write": false, 00:11:28.906 "abort": true, 00:11:28.906 "seek_hole": false, 00:11:28.906 "seek_data": false, 00:11:28.906 "copy": true, 00:11:28.906 "nvme_iov_md": false 00:11:28.906 }, 00:11:28.906 "memory_domains": [ 00:11:28.906 { 00:11:28.906 "dma_device_id": "system", 00:11:28.906 "dma_device_type": 1 00:11:28.906 }, 00:11:28.906 { 00:11:28.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.906 "dma_device_type": 2 00:11:28.906 } 00:11:28.906 ], 00:11:28.906 "driver_specific": { 00:11:28.906 "passthru": { 00:11:28.906 "name": "pt2", 00:11:28.906 "base_bdev_name": "malloc2" 00:11:28.906 } 00:11:28.906 } 00:11:28.906 }' 00:11:28.906 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.906 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.906 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.906 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:29.166 07:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:29.425 [2024-07-15 07:47:14.074642] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:29.426 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ddbb871c-5d6b-4705-a1c6-d07252e29d9c 00:11:29.426 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ddbb871c-5d6b-4705-a1c6-d07252e29d9c ']' 00:11:29.426 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:29.686 [2024-07-15 07:47:14.266925] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:29.686 [2024-07-15 07:47:14.266935] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:11:29.686 [2024-07-15 07:47:14.266971] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:29.686 [2024-07-15 07:47:14.267004] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:29.686 [2024-07-15 07:47:14.267011] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a83d0 name raid_bdev1, state offline 00:11:29.686 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.686 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:29.946 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:29.946 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:29.946 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:29.946 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:29.946 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:29.946 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:30.206 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:30.206 07:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.467 [2024-07-15 07:47:15.205266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:30.467 [2024-07-15 07:47:15.206331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:30.467 [2024-07-15 07:47:15.206373] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:30.467 [2024-07-15 07:47:15.206400] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:30.467 [2024-07-15 07:47:15.206410] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:30.467 [2024-07-15 07:47:15.206416] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16fb070 name raid_bdev1, state configuring 00:11:30.467 request: 00:11:30.467 { 00:11:30.467 "name": "raid_bdev1", 00:11:30.467 "raid_level": "concat", 00:11:30.467 "base_bdevs": [ 00:11:30.467 "malloc1", 00:11:30.467 "malloc2" 00:11:30.467 ], 00:11:30.467 "strip_size_kb": 64, 00:11:30.467 "superblock": false, 00:11:30.467 "method": "bdev_raid_create", 00:11:30.467 "req_id": 1 00:11:30.467 } 00:11:30.467 Got JSON-RPC error response 00:11:30.467 response: 00:11:30.467 { 00:11:30.467 "code": -17, 00:11:30.467 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:30.467 } 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:30.467 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.727 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:30.727 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:30.727 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:30.727 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:30.987 [2024-07-15 07:47:15.574144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:30.987 [2024-07-15 07:47:15.574165] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:30.987 [2024-07-15 07:47:15.574175] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fce00 00:11:30.987 [2024-07-15 07:47:15.574182] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:30.987 [2024-07-15 07:47:15.575466] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:30.987 [2024-07-15 07:47:15.575485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:30.987 [2024-07-15 07:47:15.575534] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:30.987 [2024-07-15 07:47:15.575552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:30.987 pt1 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.987 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:31.247 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.247 "name": "raid_bdev1", 00:11:31.247 "uuid": "ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:31.247 "strip_size_kb": 64, 00:11:31.247 "state": "configuring", 00:11:31.247 "raid_level": "concat", 00:11:31.247 "superblock": true, 00:11:31.247 "num_base_bdevs": 2, 00:11:31.247 "num_base_bdevs_discovered": 1, 00:11:31.247 "num_base_bdevs_operational": 2, 00:11:31.247 "base_bdevs_list": [ 00:11:31.247 { 00:11:31.247 "name": "pt1", 00:11:31.247 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:31.247 "is_configured": true, 00:11:31.247 "data_offset": 2048, 00:11:31.247 "data_size": 63488 00:11:31.247 }, 00:11:31.247 { 00:11:31.247 "name": null, 00:11:31.247 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:31.247 "is_configured": false, 00:11:31.247 "data_offset": 2048, 00:11:31.247 "data_size": 63488 00:11:31.247 } 00:11:31.247 ] 00:11:31.247 }' 00:11:31.247 07:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.247 07:47:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:31.831 [2024-07-15 07:47:16.444341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:31.831 [2024-07-15 07:47:16.444367] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:31.831 [2024-07-15 07:47:16.444377] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fc3e0 00:11:31.831 [2024-07-15 07:47:16.444383] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:31.831 [2024-07-15 07:47:16.444638] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:31.831 [2024-07-15 07:47:16.444648] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:31.831 [2024-07-15 07:47:16.444687] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:31.831 [2024-07-15 07:47:16.444698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:31.831 [2024-07-15 07:47:16.444778] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16fa680 00:11:31.831 [2024-07-15 07:47:16.444785] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:31.831 [2024-07-15 07:47:16.444919] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a77f0 00:11:31.831 [2024-07-15 07:47:16.445013] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16fa680 00:11:31.831 [2024-07-15 07:47:16.445018] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16fa680 00:11:31.831 [2024-07-15 07:47:16.445089] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.831 pt2 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.831 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:32.092 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:32.092 "name": "raid_bdev1", 00:11:32.092 "uuid": "ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:32.092 "strip_size_kb": 64, 00:11:32.092 "state": "online", 00:11:32.092 "raid_level": "concat", 00:11:32.092 "superblock": true, 00:11:32.092 "num_base_bdevs": 2, 00:11:32.092 "num_base_bdevs_discovered": 2, 00:11:32.092 "num_base_bdevs_operational": 2, 00:11:32.092 "base_bdevs_list": [ 00:11:32.092 { 00:11:32.092 "name": "pt1", 00:11:32.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:32.092 "is_configured": true, 00:11:32.092 "data_offset": 2048, 00:11:32.092 "data_size": 63488 00:11:32.092 }, 00:11:32.092 { 00:11:32.092 "name": "pt2", 00:11:32.092 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:32.092 "is_configured": true, 00:11:32.092 "data_offset": 2048, 00:11:32.092 "data_size": 63488 00:11:32.092 } 00:11:32.092 ] 00:11:32.092 }' 00:11:32.092 07:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.092 07:47:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:32.662 [2024-07-15 07:47:17.366869] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:32.662 "name": "raid_bdev1", 00:11:32.662 "aliases": [ 00:11:32.662 "ddbb871c-5d6b-4705-a1c6-d07252e29d9c" 00:11:32.662 ], 00:11:32.662 "product_name": "Raid Volume", 00:11:32.662 "block_size": 512, 00:11:32.662 "num_blocks": 126976, 00:11:32.662 "uuid": "ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:32.662 "assigned_rate_limits": { 00:11:32.662 "rw_ios_per_sec": 0, 00:11:32.662 "rw_mbytes_per_sec": 0, 00:11:32.662 "r_mbytes_per_sec": 0, 00:11:32.662 "w_mbytes_per_sec": 0 00:11:32.662 }, 00:11:32.662 "claimed": false, 00:11:32.662 "zoned": false, 00:11:32.662 "supported_io_types": { 00:11:32.662 "read": true, 00:11:32.662 "write": true, 00:11:32.662 "unmap": true, 00:11:32.662 "flush": true, 00:11:32.662 "reset": true, 00:11:32.662 "nvme_admin": false, 00:11:32.662 "nvme_io": false, 00:11:32.662 "nvme_io_md": false, 00:11:32.662 "write_zeroes": true, 00:11:32.662 "zcopy": false, 00:11:32.662 "get_zone_info": false, 00:11:32.662 "zone_management": false, 00:11:32.662 "zone_append": false, 00:11:32.662 "compare": false, 00:11:32.662 "compare_and_write": false, 00:11:32.662 "abort": false, 00:11:32.662 "seek_hole": false, 00:11:32.662 "seek_data": false, 00:11:32.662 "copy": false, 00:11:32.662 "nvme_iov_md": false 00:11:32.662 }, 00:11:32.662 "memory_domains": [ 00:11:32.662 { 00:11:32.662 "dma_device_id": 
"system", 00:11:32.662 "dma_device_type": 1 00:11:32.662 }, 00:11:32.662 { 00:11:32.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.662 "dma_device_type": 2 00:11:32.662 }, 00:11:32.662 { 00:11:32.662 "dma_device_id": "system", 00:11:32.662 "dma_device_type": 1 00:11:32.662 }, 00:11:32.662 { 00:11:32.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.662 "dma_device_type": 2 00:11:32.662 } 00:11:32.662 ], 00:11:32.662 "driver_specific": { 00:11:32.662 "raid": { 00:11:32.662 "uuid": "ddbb871c-5d6b-4705-a1c6-d07252e29d9c", 00:11:32.662 "strip_size_kb": 64, 00:11:32.662 "state": "online", 00:11:32.662 "raid_level": "concat", 00:11:32.662 "superblock": true, 00:11:32.662 "num_base_bdevs": 2, 00:11:32.662 "num_base_bdevs_discovered": 2, 00:11:32.662 "num_base_bdevs_operational": 2, 00:11:32.662 "base_bdevs_list": [ 00:11:32.662 { 00:11:32.662 "name": "pt1", 00:11:32.662 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:32.662 "is_configured": true, 00:11:32.662 "data_offset": 2048, 00:11:32.662 "data_size": 63488 00:11:32.662 }, 00:11:32.662 { 00:11:32.662 "name": "pt2", 00:11:32.662 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:32.662 "is_configured": true, 00:11:32.662 "data_offset": 2048, 00:11:32.662 "data_size": 63488 00:11:32.662 } 00:11:32.662 ] 00:11:32.662 } 00:11:32.662 } 00:11:32.662 }' 00:11:32.662 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:32.922 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:32.922 pt2' 00:11:32.922 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:32.922 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:32.922 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:32.922 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:32.922 "name": "pt1", 00:11:32.922 "aliases": [ 00:11:32.922 "00000000-0000-0000-0000-000000000001" 00:11:32.922 ], 00:11:32.922 "product_name": "passthru", 00:11:32.922 "block_size": 512, 00:11:32.922 "num_blocks": 65536, 00:11:32.922 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:32.922 "assigned_rate_limits": { 00:11:32.922 "rw_ios_per_sec": 0, 00:11:32.922 "rw_mbytes_per_sec": 0, 00:11:32.922 "r_mbytes_per_sec": 0, 00:11:32.922 "w_mbytes_per_sec": 0 00:11:32.922 }, 00:11:32.922 "claimed": true, 00:11:32.922 "claim_type": "exclusive_write", 00:11:32.922 "zoned": false, 00:11:32.922 "supported_io_types": { 00:11:32.922 "read": true, 00:11:32.922 "write": true, 00:11:32.922 "unmap": true, 00:11:32.923 "flush": true, 00:11:32.923 "reset": true, 00:11:32.923 "nvme_admin": false, 00:11:32.923 "nvme_io": false, 00:11:32.923 "nvme_io_md": false, 00:11:32.923 "write_zeroes": true, 00:11:32.923 "zcopy": true, 00:11:32.923 "get_zone_info": false, 00:11:32.923 "zone_management": false, 00:11:32.923 "zone_append": false, 00:11:32.923 "compare": false, 00:11:32.923 "compare_and_write": false, 00:11:32.923 "abort": true, 00:11:32.923 "seek_hole": false, 00:11:32.923 "seek_data": false, 00:11:32.923 "copy": true, 00:11:32.923 "nvme_iov_md": false 00:11:32.923 }, 00:11:32.923 "memory_domains": [ 00:11:32.923 { 00:11:32.923 "dma_device_id": "system", 00:11:32.923 "dma_device_type": 1 00:11:32.923 }, 
00:11:32.923 { 00:11:32.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.923 "dma_device_type": 2 00:11:32.923 } 00:11:32.923 ], 00:11:32.923 "driver_specific": { 00:11:32.923 "passthru": { 00:11:32.923 "name": "pt1", 00:11:32.923 "base_bdev_name": "malloc1" 00:11:32.923 } 00:11:32.923 } 00:11:32.923 }' 00:11:32.923 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.183 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.442 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:33.442 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.442 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:33.442 07:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:33.443 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:33.443 "name": "pt2", 00:11:33.443 "aliases": [ 00:11:33.443 "00000000-0000-0000-0000-000000000002" 00:11:33.443 ], 00:11:33.443 "product_name": "passthru", 00:11:33.443 "block_size": 512, 00:11:33.443 "num_blocks": 65536, 00:11:33.443 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:33.443 "assigned_rate_limits": { 00:11:33.443 "rw_ios_per_sec": 0, 00:11:33.443 "rw_mbytes_per_sec": 0, 00:11:33.443 "r_mbytes_per_sec": 0, 00:11:33.443 "w_mbytes_per_sec": 0 00:11:33.443 }, 00:11:33.443 "claimed": true, 00:11:33.443 "claim_type": "exclusive_write", 00:11:33.443 "zoned": false, 00:11:33.443 "supported_io_types": { 00:11:33.443 "read": true, 00:11:33.443 "write": true, 00:11:33.443 "unmap": true, 00:11:33.443 "flush": true, 00:11:33.443 "reset": true, 00:11:33.443 "nvme_admin": false, 00:11:33.443 "nvme_io": false, 00:11:33.443 "nvme_io_md": false, 00:11:33.443 "write_zeroes": true, 00:11:33.443 "zcopy": true, 00:11:33.443 "get_zone_info": false, 00:11:33.443 "zone_management": false, 00:11:33.443 "zone_append": false, 00:11:33.443 "compare": false, 00:11:33.443 "compare_and_write": false, 00:11:33.443 "abort": true, 00:11:33.443 "seek_hole": false, 00:11:33.443 "seek_data": false, 00:11:33.443 "copy": true, 00:11:33.443 "nvme_iov_md": false 00:11:33.443 }, 00:11:33.443 "memory_domains": [ 00:11:33.443 { 00:11:33.443 "dma_device_id": "system", 00:11:33.443 "dma_device_type": 1 00:11:33.443 }, 00:11:33.443 { 00:11:33.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.443 "dma_device_type": 2 00:11:33.443 } 00:11:33.443 ], 
00:11:33.443 "driver_specific": { 00:11:33.443 "passthru": { 00:11:33.443 "name": "pt2", 00:11:33.443 "base_bdev_name": "malloc2" 00:11:33.443 } 00:11:33.443 } 00:11:33.443 }' 00:11:33.443 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.704 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.964 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.964 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:33.964 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:33.964 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:33.964 [2024-07-15 07:47:18.714260] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ddbb871c-5d6b-4705-a1c6-d07252e29d9c '!=' ddbb871c-5d6b-4705-a1c6-d07252e29d9c ']' 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1596611 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1596611 ']' 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1596611 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1596611 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:34.224 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:34.225 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1596611' 00:11:34.225 killing process with pid 1596611 00:11:34.225 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1596611 00:11:34.225 [2024-07-15 07:47:18.789202] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:34.225 [2024-07-15 
07:47:18.789241] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:34.225 [2024-07-15 07:47:18.789270] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:34.225 [2024-07-15 07:47:18.789276] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16fa680 name raid_bdev1, state offline 00:11:34.225 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1596611 00:11:34.225 [2024-07-15 07:47:18.798270] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:34.225 07:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:34.225 00:11:34.225 real 0m8.954s 00:11:34.225 user 0m16.350s 00:11:34.225 sys 0m1.324s 00:11:34.225 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:34.225 07:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.225 ************************************ 00:11:34.225 END TEST raid_superblock_test 00:11:34.225 ************************************ 00:11:34.225 07:47:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:34.225 07:47:18 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:11:34.225 07:47:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:34.225 07:47:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.225 07:47:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:34.485 ************************************ 00:11:34.485 START TEST raid_read_error_test 00:11:34.485 ************************************ 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:34.485 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:34.486 07:47:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:34.486 07:47:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Zitg4mDdSb 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1598363 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1598363 /var/tmp/spdk-raid.sock 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1598363 ']' 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:34.486 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:34.486 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.486 [2024-07-15 07:47:19.061802] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
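
The trace below is the bring-up for the read-error run: bdevperf is started against /var/tmp/spdk-raid.sock, two malloc bdevs are wrapped in error-injection bdevs and passthru bdevs, a concat raid_bdev1 is assembled on top, read failures are injected into the first base bdev, and the bdevperf job is driven. A minimal sketch of that RPC sequence, with the full paths to scripts/rpc.py and examples/bdev/bdevperf/bdevperf.py abbreviated, follows; the commands and arguments are the ones visible in this trace.

  # Base bdevs: malloc devices wrapped in error bdevs, exposed via passthru bdevs.
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
  rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
  # Concat RAID with a 64k strip and an on-disk superblock (-s).
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # Run the bdevperf job while injecting read failures on the first base bdev
  # (assumption: the harness overlaps these two steps; shown backgrounded here).
  bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
  wait

Since concat carries no redundancy (has_redundancy returns 1 for this level), the injected errors surface as failed I/Os; the test only asserts that the failure rate grepped from the bdevperf log is non-zero (0.49 failures/s in this run).
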
00:11:34.486 [2024-07-15 07:47:19.061868] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598363 ] 00:11:34.486 [2024-07-15 07:47:19.154156] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:34.486 [2024-07-15 07:47:19.221695] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.746 [2024-07-15 07:47:19.265916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.746 [2024-07-15 07:47:19.265940] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.316 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:35.316 07:47:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:35.316 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:35.316 07:47:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:35.577 BaseBdev1_malloc 00:11:35.577 07:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:35.577 true 00:11:35.577 07:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:35.837 [2024-07-15 07:47:20.464799] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:35.837 [2024-07-15 07:47:20.464832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:35.837 [2024-07-15 07:47:20.464843] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1027b50 00:11:35.837 [2024-07-15 07:47:20.464850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:35.837 [2024-07-15 07:47:20.466166] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:35.837 [2024-07-15 07:47:20.466185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:35.837 BaseBdev1 00:11:35.837 07:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:35.837 07:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:36.117 BaseBdev2_malloc 00:11:36.117 07:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:36.117 true 00:11:36.390 07:47:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:36.390 [2024-07-15 07:47:21.064087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:36.390 [2024-07-15 07:47:21.064117] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.390 [2024-07-15 07:47:21.064128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x100bea0 00:11:36.390 [2024-07-15 07:47:21.064134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.390 [2024-07-15 07:47:21.065317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.390 [2024-07-15 07:47:21.065335] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:36.390 BaseBdev2 00:11:36.390 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:36.963 [2024-07-15 07:47:21.593429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:36.963 [2024-07-15 07:47:21.594425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:36.963 [2024-07-15 07:47:21.594563] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe75360 00:11:36.963 [2024-07-15 07:47:21.594571] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:36.963 [2024-07-15 07:47:21.594719] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1010090 00:11:36.963 [2024-07-15 07:47:21.594832] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe75360 00:11:36.963 [2024-07-15 07:47:21.594837] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe75360 00:11:36.963 [2024-07-15 07:47:21.594912] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.963 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:37.223 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.223 "name": "raid_bdev1", 00:11:37.223 "uuid": "65d5158c-ae21-4ac5-a723-d8fb309658a3", 00:11:37.223 "strip_size_kb": 64, 00:11:37.223 "state": "online", 00:11:37.223 "raid_level": "concat", 
00:11:37.223 "superblock": true, 00:11:37.223 "num_base_bdevs": 2, 00:11:37.223 "num_base_bdevs_discovered": 2, 00:11:37.223 "num_base_bdevs_operational": 2, 00:11:37.223 "base_bdevs_list": [ 00:11:37.223 { 00:11:37.223 "name": "BaseBdev1", 00:11:37.223 "uuid": "f1fe24d2-93cc-5d3a-8def-738f6d25fb90", 00:11:37.223 "is_configured": true, 00:11:37.223 "data_offset": 2048, 00:11:37.223 "data_size": 63488 00:11:37.223 }, 00:11:37.223 { 00:11:37.223 "name": "BaseBdev2", 00:11:37.223 "uuid": "03644b54-9441-5722-a75c-228f6850ac15", 00:11:37.223 "is_configured": true, 00:11:37.223 "data_offset": 2048, 00:11:37.223 "data_size": 63488 00:11:37.223 } 00:11:37.223 ] 00:11:37.223 }' 00:11:37.223 07:47:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.223 07:47:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.793 07:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:37.794 07:47:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:37.794 [2024-07-15 07:47:22.491906] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x100ffd0 00:11:38.734 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.994 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:39.253 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.253 "name": "raid_bdev1", 00:11:39.253 "uuid": "65d5158c-ae21-4ac5-a723-d8fb309658a3", 00:11:39.253 "strip_size_kb": 64, 00:11:39.253 "state": "online", 
00:11:39.253 "raid_level": "concat", 00:11:39.253 "superblock": true, 00:11:39.253 "num_base_bdevs": 2, 00:11:39.253 "num_base_bdevs_discovered": 2, 00:11:39.253 "num_base_bdevs_operational": 2, 00:11:39.253 "base_bdevs_list": [ 00:11:39.253 { 00:11:39.253 "name": "BaseBdev1", 00:11:39.253 "uuid": "f1fe24d2-93cc-5d3a-8def-738f6d25fb90", 00:11:39.253 "is_configured": true, 00:11:39.253 "data_offset": 2048, 00:11:39.253 "data_size": 63488 00:11:39.253 }, 00:11:39.253 { 00:11:39.253 "name": "BaseBdev2", 00:11:39.253 "uuid": "03644b54-9441-5722-a75c-228f6850ac15", 00:11:39.253 "is_configured": true, 00:11:39.253 "data_offset": 2048, 00:11:39.253 "data_size": 63488 00:11:39.253 } 00:11:39.253 ] 00:11:39.253 }' 00:11:39.253 07:47:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.253 07:47:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:39.823 [2024-07-15 07:47:24.526375] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:39.823 [2024-07-15 07:47:24.526404] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:39.823 [2024-07-15 07:47:24.528993] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:39.823 [2024-07-15 07:47:24.529018] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:39.823 [2024-07-15 07:47:24.529039] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:39.823 [2024-07-15 07:47:24.529045] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe75360 name raid_bdev1, state offline 00:11:39.823 0 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1598363 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1598363 ']' 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1598363 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:39.823 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1598363 00:11:40.083 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:40.083 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:40.083 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1598363' 00:11:40.083 killing process with pid 1598363 00:11:40.083 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1598363 00:11:40.083 [2024-07-15 07:47:24.598110] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:40.083 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1598363 00:11:40.083 [2024-07-15 07:47:24.603827] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Zitg4mDdSb 00:11:40.084 07:47:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:11:40.084 00:11:40.084 real 0m5.748s 00:11:40.084 user 0m9.144s 00:11:40.084 sys 0m0.823s 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:40.084 07:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.084 ************************************ 00:11:40.084 END TEST raid_read_error_test 00:11:40.084 ************************************ 00:11:40.084 07:47:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:40.084 07:47:24 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:11:40.084 07:47:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:40.084 07:47:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:40.084 07:47:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:40.084 ************************************ 00:11:40.084 START TEST raid_write_error_test 00:11:40.084 ************************************ 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:40.084 07:47:24 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.fb0jScGCJy 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1599409 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1599409 /var/tmp/spdk-raid.sock 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1599409 ']' 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:40.084 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:40.084 07:47:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.344 [2024-07-15 07:47:24.883012] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
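
The write-error run that follows repeats the same bring-up, two error-injectable malloc bdevs under a concat raid_bdev1 with a superblock, and differs only in injecting failures on the write path before checking the bdevperf failure rate. Sketch, with the same path abbreviations and backgrounding assumption as above:

  bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
  rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
  wait

As in the read case, concat offers no redundancy, so the injected write errors are expected to show up as a non-zero failure rate grepped from the bdevperf log (/raidtest/tmp.fb0jScGCJy here).
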
00:11:40.344 [2024-07-15 07:47:24.883061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1599409 ] 00:11:40.344 [2024-07-15 07:47:24.971953] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:40.344 [2024-07-15 07:47:25.040207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:40.344 [2024-07-15 07:47:25.082506] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.344 [2024-07-15 07:47:25.082530] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.286 07:47:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:41.286 07:47:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:41.286 07:47:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:41.286 07:47:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:41.286 BaseBdev1_malloc 00:11:41.286 07:47:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:41.546 true 00:11:41.546 07:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:41.546 [2024-07-15 07:47:26.277251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:41.546 [2024-07-15 07:47:26.277283] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:41.546 [2024-07-15 07:47:26.277295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f1b50 00:11:41.546 [2024-07-15 07:47:26.277302] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:41.546 [2024-07-15 07:47:26.278626] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:41.546 [2024-07-15 07:47:26.278646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:41.546 BaseBdev1 00:11:41.546 07:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:41.546 07:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:41.806 BaseBdev2_malloc 00:11:41.806 07:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:42.066 true 00:11:42.066 07:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:42.326 [2024-07-15 07:47:26.848660] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:42.326 [2024-07-15 07:47:26.848688] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:42.326 [2024-07-15 07:47:26.848699] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27d5ea0 00:11:42.326 [2024-07-15 07:47:26.848705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.326 [2024-07-15 07:47:26.849996] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.326 [2024-07-15 07:47:26.850015] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:42.326 BaseBdev2 00:11:42.326 07:47:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:42.326 [2024-07-15 07:47:27.041169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:42.326 [2024-07-15 07:47:27.042184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:42.326 [2024-07-15 07:47:27.042321] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x263f360 00:11:42.326 [2024-07-15 07:47:27.042329] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:42.326 [2024-07-15 07:47:27.042477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27da090 00:11:42.326 [2024-07-15 07:47:27.042590] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x263f360 00:11:42.326 [2024-07-15 07:47:27.042596] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x263f360 00:11:42.326 [2024-07-15 07:47:27.042671] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.326 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:42.586 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.586 "name": "raid_bdev1", 00:11:42.586 "uuid": "f6ab66dd-d122-447d-a85a-9c3de7bde1ac", 00:11:42.586 "strip_size_kb": 64, 00:11:42.586 "state": "online", 00:11:42.586 
"raid_level": "concat", 00:11:42.586 "superblock": true, 00:11:42.587 "num_base_bdevs": 2, 00:11:42.587 "num_base_bdevs_discovered": 2, 00:11:42.587 "num_base_bdevs_operational": 2, 00:11:42.587 "base_bdevs_list": [ 00:11:42.587 { 00:11:42.587 "name": "BaseBdev1", 00:11:42.587 "uuid": "2c2aa9ad-acb1-58b9-a03b-b4958dec469f", 00:11:42.587 "is_configured": true, 00:11:42.587 "data_offset": 2048, 00:11:42.587 "data_size": 63488 00:11:42.587 }, 00:11:42.587 { 00:11:42.587 "name": "BaseBdev2", 00:11:42.587 "uuid": "e5603cb7-50a3-5080-94bb-c68fca62cb13", 00:11:42.587 "is_configured": true, 00:11:42.587 "data_offset": 2048, 00:11:42.587 "data_size": 63488 00:11:42.587 } 00:11:42.587 ] 00:11:42.587 }' 00:11:42.587 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.587 07:47:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.158 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:43.158 07:47:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:43.419 [2024-07-15 07:47:27.915584] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27d9fd0 00:11:44.358 07:47:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.358 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:44.618 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.618 "name": "raid_bdev1", 00:11:44.618 "uuid": "f6ab66dd-d122-447d-a85a-9c3de7bde1ac", 00:11:44.618 "strip_size_kb": 
64, 00:11:44.618 "state": "online", 00:11:44.618 "raid_level": "concat", 00:11:44.618 "superblock": true, 00:11:44.618 "num_base_bdevs": 2, 00:11:44.618 "num_base_bdevs_discovered": 2, 00:11:44.618 "num_base_bdevs_operational": 2, 00:11:44.618 "base_bdevs_list": [ 00:11:44.618 { 00:11:44.618 "name": "BaseBdev1", 00:11:44.618 "uuid": "2c2aa9ad-acb1-58b9-a03b-b4958dec469f", 00:11:44.618 "is_configured": true, 00:11:44.618 "data_offset": 2048, 00:11:44.618 "data_size": 63488 00:11:44.618 }, 00:11:44.618 { 00:11:44.618 "name": "BaseBdev2", 00:11:44.618 "uuid": "e5603cb7-50a3-5080-94bb-c68fca62cb13", 00:11:44.619 "is_configured": true, 00:11:44.619 "data_offset": 2048, 00:11:44.619 "data_size": 63488 00:11:44.619 } 00:11:44.619 ] 00:11:44.619 }' 00:11:44.619 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.619 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.188 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:45.188 [2024-07-15 07:47:29.918087] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:45.188 [2024-07-15 07:47:29.918114] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:45.188 [2024-07-15 07:47:29.920695] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:45.188 [2024-07-15 07:47:29.920724] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.188 [2024-07-15 07:47:29.920744] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:45.188 [2024-07-15 07:47:29.920750] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x263f360 name raid_bdev1, state offline 00:11:45.188 0 00:11:45.188 07:47:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1599409 00:11:45.188 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1599409 ']' 00:11:45.188 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1599409 00:11:45.188 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1599409 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1599409' 00:11:45.448 killing process with pid 1599409 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1599409 00:11:45.448 [2024-07-15 07:47:29.990541] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:45.448 07:47:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1599409 00:11:45.448 [2024-07-15 07:47:29.996180] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.fb0jScGCJy 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:45.448 00:11:45.448 real 0m5.318s 00:11:45.448 user 0m8.332s 00:11:45.448 sys 0m0.780s 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:45.448 07:47:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.448 ************************************ 00:11:45.448 END TEST raid_write_error_test 00:11:45.448 ************************************ 00:11:45.448 07:47:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:45.448 07:47:30 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:45.448 07:47:30 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:45.448 07:47:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:45.448 07:47:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:45.448 07:47:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:45.709 ************************************ 00:11:45.709 START TEST raid_state_function_test 00:11:45.709 ************************************ 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:45.709 07:47:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1600418 00:11:45.709 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1600418' 00:11:45.710 Process raid pid: 1600418 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1600418 /var/tmp/spdk-raid.sock 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1600418 ']' 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:45.710 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:45.710 07:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.710 [2024-07-15 07:47:30.284897] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:11:45.710 [2024-07-15 07:47:30.284951] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:45.710 [2024-07-15 07:47:30.375860] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:45.710 [2024-07-15 07:47:30.443672] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:45.970 [2024-07-15 07:47:30.489986] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.970 [2024-07-15 07:47:30.490011] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:46.540 07:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:46.540 07:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:46.540 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:46.540 [2024-07-15 07:47:31.289781] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:46.540 [2024-07-15 07:47:31.289810] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:46.540 [2024-07-15 07:47:31.289816] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:46.540 [2024-07-15 07:47:31.289822] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:46.801 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:46.801 "name": "Existed_Raid", 00:11:46.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:46.801 "strip_size_kb": 0, 00:11:46.801 "state": "configuring", 00:11:46.801 "raid_level": "raid1", 00:11:46.801 "superblock": false, 00:11:46.801 
"num_base_bdevs": 2, 00:11:46.801 "num_base_bdevs_discovered": 0, 00:11:46.801 "num_base_bdevs_operational": 2, 00:11:46.801 "base_bdevs_list": [ 00:11:46.801 { 00:11:46.801 "name": "BaseBdev1", 00:11:46.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:46.801 "is_configured": false, 00:11:46.801 "data_offset": 0, 00:11:46.801 "data_size": 0 00:11:46.801 }, 00:11:46.801 { 00:11:46.802 "name": "BaseBdev2", 00:11:46.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:46.802 "is_configured": false, 00:11:46.802 "data_offset": 0, 00:11:46.802 "data_size": 0 00:11:46.802 } 00:11:46.802 ] 00:11:46.802 }' 00:11:46.802 07:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:46.802 07:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.372 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:47.632 [2024-07-15 07:47:32.203995] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:47.632 [2024-07-15 07:47:32.204012] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179e6b0 name Existed_Raid, state configuring 00:11:47.632 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:47.894 [2024-07-15 07:47:32.396490] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:47.894 [2024-07-15 07:47:32.396509] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:47.894 [2024-07-15 07:47:32.396514] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:47.894 [2024-07-15 07:47:32.396520] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:47.894 [2024-07-15 07:47:32.587600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:47.894 BaseBdev1 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:47.894 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:48.155 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:48.416 [ 
00:11:48.416 { 00:11:48.416 "name": "BaseBdev1", 00:11:48.416 "aliases": [ 00:11:48.416 "46924534-beb4-4b08-8d5f-9ff6994eeb3e" 00:11:48.416 ], 00:11:48.416 "product_name": "Malloc disk", 00:11:48.416 "block_size": 512, 00:11:48.416 "num_blocks": 65536, 00:11:48.416 "uuid": "46924534-beb4-4b08-8d5f-9ff6994eeb3e", 00:11:48.416 "assigned_rate_limits": { 00:11:48.416 "rw_ios_per_sec": 0, 00:11:48.416 "rw_mbytes_per_sec": 0, 00:11:48.416 "r_mbytes_per_sec": 0, 00:11:48.416 "w_mbytes_per_sec": 0 00:11:48.416 }, 00:11:48.416 "claimed": true, 00:11:48.416 "claim_type": "exclusive_write", 00:11:48.417 "zoned": false, 00:11:48.417 "supported_io_types": { 00:11:48.417 "read": true, 00:11:48.417 "write": true, 00:11:48.417 "unmap": true, 00:11:48.417 "flush": true, 00:11:48.417 "reset": true, 00:11:48.417 "nvme_admin": false, 00:11:48.417 "nvme_io": false, 00:11:48.417 "nvme_io_md": false, 00:11:48.417 "write_zeroes": true, 00:11:48.417 "zcopy": true, 00:11:48.417 "get_zone_info": false, 00:11:48.417 "zone_management": false, 00:11:48.417 "zone_append": false, 00:11:48.417 "compare": false, 00:11:48.417 "compare_and_write": false, 00:11:48.417 "abort": true, 00:11:48.417 "seek_hole": false, 00:11:48.417 "seek_data": false, 00:11:48.417 "copy": true, 00:11:48.417 "nvme_iov_md": false 00:11:48.417 }, 00:11:48.417 "memory_domains": [ 00:11:48.417 { 00:11:48.417 "dma_device_id": "system", 00:11:48.417 "dma_device_type": 1 00:11:48.417 }, 00:11:48.417 { 00:11:48.417 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.417 "dma_device_type": 2 00:11:48.417 } 00:11:48.417 ], 00:11:48.417 "driver_specific": {} 00:11:48.417 } 00:11:48.417 ] 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.417 07:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:48.676 07:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.676 "name": "Existed_Raid", 00:11:48.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.676 "strip_size_kb": 0, 00:11:48.676 "state": "configuring", 00:11:48.676 "raid_level": "raid1", 
00:11:48.676 "superblock": false, 00:11:48.676 "num_base_bdevs": 2, 00:11:48.676 "num_base_bdevs_discovered": 1, 00:11:48.676 "num_base_bdevs_operational": 2, 00:11:48.676 "base_bdevs_list": [ 00:11:48.676 { 00:11:48.676 "name": "BaseBdev1", 00:11:48.676 "uuid": "46924534-beb4-4b08-8d5f-9ff6994eeb3e", 00:11:48.676 "is_configured": true, 00:11:48.676 "data_offset": 0, 00:11:48.676 "data_size": 65536 00:11:48.676 }, 00:11:48.676 { 00:11:48.676 "name": "BaseBdev2", 00:11:48.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:48.676 "is_configured": false, 00:11:48.676 "data_offset": 0, 00:11:48.676 "data_size": 0 00:11:48.676 } 00:11:48.676 ] 00:11:48.676 }' 00:11:48.676 07:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.676 07:47:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.245 07:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:49.245 [2024-07-15 07:47:33.902933] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:49.245 [2024-07-15 07:47:33.902962] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179dfa0 name Existed_Raid, state configuring 00:11:49.245 07:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:49.506 [2024-07-15 07:47:34.091433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:49.506 [2024-07-15 07:47:34.092583] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:49.506 [2024-07-15 07:47:34.092608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:49.506 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.766 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.766 "name": "Existed_Raid", 00:11:49.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.766 "strip_size_kb": 0, 00:11:49.766 "state": "configuring", 00:11:49.766 "raid_level": "raid1", 00:11:49.766 "superblock": false, 00:11:49.766 "num_base_bdevs": 2, 00:11:49.766 "num_base_bdevs_discovered": 1, 00:11:49.767 "num_base_bdevs_operational": 2, 00:11:49.767 "base_bdevs_list": [ 00:11:49.767 { 00:11:49.767 "name": "BaseBdev1", 00:11:49.767 "uuid": "46924534-beb4-4b08-8d5f-9ff6994eeb3e", 00:11:49.767 "is_configured": true, 00:11:49.767 "data_offset": 0, 00:11:49.767 "data_size": 65536 00:11:49.767 }, 00:11:49.767 { 00:11:49.767 "name": "BaseBdev2", 00:11:49.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.767 "is_configured": false, 00:11:49.767 "data_offset": 0, 00:11:49.767 "data_size": 0 00:11:49.767 } 00:11:49.767 ] 00:11:49.767 }' 00:11:49.767 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.767 07:47:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.337 07:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:50.337 [2024-07-15 07:47:35.026796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:50.337 [2024-07-15 07:47:35.026823] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x179ed90 00:11:50.337 [2024-07-15 07:47:35.026828] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:50.337 [2024-07-15 07:47:35.026971] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1942850 00:11:50.337 [2024-07-15 07:47:35.027064] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x179ed90 00:11:50.337 [2024-07-15 07:47:35.027070] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x179ed90 00:11:50.337 [2024-07-15 07:47:35.027190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.337 BaseBdev2 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:50.337 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:50.598 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:50.859 [ 
00:11:50.859 { 00:11:50.859 "name": "BaseBdev2", 00:11:50.859 "aliases": [ 00:11:50.859 "9ace28ff-8b36-4c2c-b4d0-138acfbea22d" 00:11:50.859 ], 00:11:50.859 "product_name": "Malloc disk", 00:11:50.859 "block_size": 512, 00:11:50.859 "num_blocks": 65536, 00:11:50.859 "uuid": "9ace28ff-8b36-4c2c-b4d0-138acfbea22d", 00:11:50.859 "assigned_rate_limits": { 00:11:50.859 "rw_ios_per_sec": 0, 00:11:50.859 "rw_mbytes_per_sec": 0, 00:11:50.859 "r_mbytes_per_sec": 0, 00:11:50.859 "w_mbytes_per_sec": 0 00:11:50.859 }, 00:11:50.859 "claimed": true, 00:11:50.859 "claim_type": "exclusive_write", 00:11:50.859 "zoned": false, 00:11:50.859 "supported_io_types": { 00:11:50.859 "read": true, 00:11:50.859 "write": true, 00:11:50.859 "unmap": true, 00:11:50.859 "flush": true, 00:11:50.859 "reset": true, 00:11:50.859 "nvme_admin": false, 00:11:50.859 "nvme_io": false, 00:11:50.859 "nvme_io_md": false, 00:11:50.859 "write_zeroes": true, 00:11:50.859 "zcopy": true, 00:11:50.859 "get_zone_info": false, 00:11:50.859 "zone_management": false, 00:11:50.859 "zone_append": false, 00:11:50.859 "compare": false, 00:11:50.859 "compare_and_write": false, 00:11:50.859 "abort": true, 00:11:50.859 "seek_hole": false, 00:11:50.859 "seek_data": false, 00:11:50.859 "copy": true, 00:11:50.859 "nvme_iov_md": false 00:11:50.859 }, 00:11:50.859 "memory_domains": [ 00:11:50.859 { 00:11:50.859 "dma_device_id": "system", 00:11:50.859 "dma_device_type": 1 00:11:50.859 }, 00:11:50.859 { 00:11:50.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.859 "dma_device_type": 2 00:11:50.859 } 00:11:50.859 ], 00:11:50.859 "driver_specific": {} 00:11:50.859 } 00:11:50.859 ] 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.859 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:51.119 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:51.119 "name": "Existed_Raid", 00:11:51.119 "uuid": "baf741a4-4fe3-444b-9a60-38fc90b1dcf4", 00:11:51.120 "strip_size_kb": 0, 00:11:51.120 "state": "online", 00:11:51.120 "raid_level": "raid1", 00:11:51.120 "superblock": false, 00:11:51.120 "num_base_bdevs": 2, 00:11:51.120 "num_base_bdevs_discovered": 2, 00:11:51.120 "num_base_bdevs_operational": 2, 00:11:51.120 "base_bdevs_list": [ 00:11:51.120 { 00:11:51.120 "name": "BaseBdev1", 00:11:51.120 "uuid": "46924534-beb4-4b08-8d5f-9ff6994eeb3e", 00:11:51.120 "is_configured": true, 00:11:51.120 "data_offset": 0, 00:11:51.120 "data_size": 65536 00:11:51.120 }, 00:11:51.120 { 00:11:51.120 "name": "BaseBdev2", 00:11:51.120 "uuid": "9ace28ff-8b36-4c2c-b4d0-138acfbea22d", 00:11:51.120 "is_configured": true, 00:11:51.120 "data_offset": 0, 00:11:51.120 "data_size": 65536 00:11:51.120 } 00:11:51.120 ] 00:11:51.120 }' 00:11:51.120 07:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:51.120 07:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:51.692 [2024-07-15 07:47:36.342342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:51.692 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:51.692 "name": "Existed_Raid", 00:11:51.692 "aliases": [ 00:11:51.692 "baf741a4-4fe3-444b-9a60-38fc90b1dcf4" 00:11:51.692 ], 00:11:51.692 "product_name": "Raid Volume", 00:11:51.692 "block_size": 512, 00:11:51.692 "num_blocks": 65536, 00:11:51.692 "uuid": "baf741a4-4fe3-444b-9a60-38fc90b1dcf4", 00:11:51.692 "assigned_rate_limits": { 00:11:51.692 "rw_ios_per_sec": 0, 00:11:51.692 "rw_mbytes_per_sec": 0, 00:11:51.692 "r_mbytes_per_sec": 0, 00:11:51.692 "w_mbytes_per_sec": 0 00:11:51.692 }, 00:11:51.692 "claimed": false, 00:11:51.692 "zoned": false, 00:11:51.692 "supported_io_types": { 00:11:51.692 "read": true, 00:11:51.692 "write": true, 00:11:51.692 "unmap": false, 00:11:51.692 "flush": false, 00:11:51.692 "reset": true, 00:11:51.692 "nvme_admin": false, 00:11:51.692 "nvme_io": false, 00:11:51.692 "nvme_io_md": false, 00:11:51.692 "write_zeroes": true, 00:11:51.692 "zcopy": false, 00:11:51.692 "get_zone_info": false, 00:11:51.692 "zone_management": false, 00:11:51.692 "zone_append": false, 00:11:51.692 "compare": false, 00:11:51.692 "compare_and_write": false, 00:11:51.692 "abort": false, 00:11:51.692 "seek_hole": false, 00:11:51.692 "seek_data": false, 00:11:51.692 "copy": false, 00:11:51.692 "nvme_iov_md": false 00:11:51.692 }, 00:11:51.692 
"memory_domains": [ 00:11:51.692 { 00:11:51.692 "dma_device_id": "system", 00:11:51.692 "dma_device_type": 1 00:11:51.692 }, 00:11:51.692 { 00:11:51.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.692 "dma_device_type": 2 00:11:51.692 }, 00:11:51.692 { 00:11:51.692 "dma_device_id": "system", 00:11:51.692 "dma_device_type": 1 00:11:51.692 }, 00:11:51.692 { 00:11:51.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.692 "dma_device_type": 2 00:11:51.692 } 00:11:51.692 ], 00:11:51.692 "driver_specific": { 00:11:51.692 "raid": { 00:11:51.692 "uuid": "baf741a4-4fe3-444b-9a60-38fc90b1dcf4", 00:11:51.692 "strip_size_kb": 0, 00:11:51.692 "state": "online", 00:11:51.692 "raid_level": "raid1", 00:11:51.692 "superblock": false, 00:11:51.692 "num_base_bdevs": 2, 00:11:51.692 "num_base_bdevs_discovered": 2, 00:11:51.692 "num_base_bdevs_operational": 2, 00:11:51.692 "base_bdevs_list": [ 00:11:51.692 { 00:11:51.692 "name": "BaseBdev1", 00:11:51.692 "uuid": "46924534-beb4-4b08-8d5f-9ff6994eeb3e", 00:11:51.692 "is_configured": true, 00:11:51.692 "data_offset": 0, 00:11:51.692 "data_size": 65536 00:11:51.692 }, 00:11:51.692 { 00:11:51.692 "name": "BaseBdev2", 00:11:51.692 "uuid": "9ace28ff-8b36-4c2c-b4d0-138acfbea22d", 00:11:51.692 "is_configured": true, 00:11:51.692 "data_offset": 0, 00:11:51.692 "data_size": 65536 00:11:51.692 } 00:11:51.693 ] 00:11:51.693 } 00:11:51.693 } 00:11:51.693 }' 00:11:51.693 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:51.693 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:51.693 BaseBdev2' 00:11:51.693 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:51.693 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:51.693 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:51.953 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:51.953 "name": "BaseBdev1", 00:11:51.953 "aliases": [ 00:11:51.953 "46924534-beb4-4b08-8d5f-9ff6994eeb3e" 00:11:51.953 ], 00:11:51.953 "product_name": "Malloc disk", 00:11:51.953 "block_size": 512, 00:11:51.953 "num_blocks": 65536, 00:11:51.953 "uuid": "46924534-beb4-4b08-8d5f-9ff6994eeb3e", 00:11:51.953 "assigned_rate_limits": { 00:11:51.953 "rw_ios_per_sec": 0, 00:11:51.953 "rw_mbytes_per_sec": 0, 00:11:51.953 "r_mbytes_per_sec": 0, 00:11:51.953 "w_mbytes_per_sec": 0 00:11:51.953 }, 00:11:51.953 "claimed": true, 00:11:51.953 "claim_type": "exclusive_write", 00:11:51.953 "zoned": false, 00:11:51.953 "supported_io_types": { 00:11:51.953 "read": true, 00:11:51.953 "write": true, 00:11:51.953 "unmap": true, 00:11:51.953 "flush": true, 00:11:51.953 "reset": true, 00:11:51.953 "nvme_admin": false, 00:11:51.953 "nvme_io": false, 00:11:51.953 "nvme_io_md": false, 00:11:51.953 "write_zeroes": true, 00:11:51.953 "zcopy": true, 00:11:51.953 "get_zone_info": false, 00:11:51.953 "zone_management": false, 00:11:51.953 "zone_append": false, 00:11:51.953 "compare": false, 00:11:51.953 "compare_and_write": false, 00:11:51.953 "abort": true, 00:11:51.953 "seek_hole": false, 00:11:51.953 "seek_data": false, 00:11:51.953 "copy": true, 00:11:51.953 "nvme_iov_md": false 00:11:51.953 }, 00:11:51.953 
"memory_domains": [ 00:11:51.953 { 00:11:51.953 "dma_device_id": "system", 00:11:51.953 "dma_device_type": 1 00:11:51.953 }, 00:11:51.953 { 00:11:51.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.953 "dma_device_type": 2 00:11:51.953 } 00:11:51.953 ], 00:11:51.953 "driver_specific": {} 00:11:51.953 }' 00:11:51.953 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.953 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:51.953 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:51.953 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:52.213 07:47:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:52.474 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:52.474 "name": "BaseBdev2", 00:11:52.474 "aliases": [ 00:11:52.474 "9ace28ff-8b36-4c2c-b4d0-138acfbea22d" 00:11:52.474 ], 00:11:52.474 "product_name": "Malloc disk", 00:11:52.474 "block_size": 512, 00:11:52.474 "num_blocks": 65536, 00:11:52.474 "uuid": "9ace28ff-8b36-4c2c-b4d0-138acfbea22d", 00:11:52.474 "assigned_rate_limits": { 00:11:52.474 "rw_ios_per_sec": 0, 00:11:52.474 "rw_mbytes_per_sec": 0, 00:11:52.474 "r_mbytes_per_sec": 0, 00:11:52.474 "w_mbytes_per_sec": 0 00:11:52.474 }, 00:11:52.474 "claimed": true, 00:11:52.474 "claim_type": "exclusive_write", 00:11:52.474 "zoned": false, 00:11:52.474 "supported_io_types": { 00:11:52.474 "read": true, 00:11:52.474 "write": true, 00:11:52.474 "unmap": true, 00:11:52.474 "flush": true, 00:11:52.474 "reset": true, 00:11:52.474 "nvme_admin": false, 00:11:52.474 "nvme_io": false, 00:11:52.474 "nvme_io_md": false, 00:11:52.474 "write_zeroes": true, 00:11:52.474 "zcopy": true, 00:11:52.474 "get_zone_info": false, 00:11:52.474 "zone_management": false, 00:11:52.474 "zone_append": false, 00:11:52.474 "compare": false, 00:11:52.474 "compare_and_write": false, 00:11:52.474 "abort": true, 00:11:52.474 "seek_hole": false, 00:11:52.474 "seek_data": false, 00:11:52.474 "copy": true, 00:11:52.474 "nvme_iov_md": false 00:11:52.474 }, 00:11:52.474 "memory_domains": [ 00:11:52.474 { 00:11:52.474 "dma_device_id": "system", 00:11:52.474 "dma_device_type": 1 00:11:52.474 }, 00:11:52.474 { 00:11:52.474 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:52.474 "dma_device_type": 2 00:11:52.474 } 00:11:52.474 ], 00:11:52.474 "driver_specific": {} 00:11:52.474 }' 00:11:52.474 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.474 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:52.474 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:52.474 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:52.734 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:53.026 [2024-07-15 07:47:37.669521] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:53.026 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.286 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.286 "name": "Existed_Raid", 00:11:53.286 "uuid": "baf741a4-4fe3-444b-9a60-38fc90b1dcf4", 00:11:53.286 "strip_size_kb": 0, 00:11:53.286 "state": "online", 00:11:53.286 "raid_level": "raid1", 00:11:53.286 "superblock": false, 00:11:53.286 "num_base_bdevs": 2, 00:11:53.286 "num_base_bdevs_discovered": 1, 00:11:53.286 "num_base_bdevs_operational": 1, 00:11:53.286 "base_bdevs_list": [ 00:11:53.286 { 00:11:53.286 "name": null, 00:11:53.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.286 "is_configured": false, 00:11:53.286 "data_offset": 0, 00:11:53.286 "data_size": 65536 00:11:53.286 }, 00:11:53.286 { 00:11:53.286 "name": "BaseBdev2", 00:11:53.286 "uuid": "9ace28ff-8b36-4c2c-b4d0-138acfbea22d", 00:11:53.286 "is_configured": true, 00:11:53.286 "data_offset": 0, 00:11:53.286 "data_size": 65536 00:11:53.286 } 00:11:53.286 ] 00:11:53.286 }' 00:11:53.286 07:47:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.286 07:47:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:53.857 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:54.117 [2024-07-15 07:47:38.772328] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:54.117 [2024-07-15 07:47:38.772385] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:54.117 [2024-07-15 07:47:38.778423] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:54.117 [2024-07-15 07:47:38.778447] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:54.117 [2024-07-15 07:47:38.778453] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179ed90 name Existed_Raid, state offline 00:11:54.117 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:54.117 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:54.117 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.117 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
raid_bdev= 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1600418 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1600418 ']' 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1600418 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:54.377 07:47:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1600418 00:11:54.377 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:54.377 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:54.377 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1600418' 00:11:54.377 killing process with pid 1600418 00:11:54.377 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1600418 00:11:54.377 [2024-07-15 07:47:39.033256] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:54.377 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1600418 00:11:54.377 [2024-07-15 07:47:39.033858] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:54.637 00:11:54.637 real 0m8.939s 00:11:54.637 user 0m16.261s 00:11:54.637 sys 0m1.358s 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.637 ************************************ 00:11:54.637 END TEST raid_state_function_test 00:11:54.637 ************************************ 00:11:54.637 07:47:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:54.637 07:47:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:54.637 07:47:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:54.637 07:47:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:54.637 07:47:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:54.637 ************************************ 00:11:54.637 START TEST raid_state_function_test_sb 00:11:54.637 ************************************ 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:54.637 07:47:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1602175 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1602175' 00:11:54.638 Process raid pid: 1602175 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1602175 /var/tmp/spdk-raid.sock 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1602175 ']' 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:54.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:54.638 07:47:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.638 [2024-07-15 07:47:39.287210] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:11:54.638 [2024-07-15 07:47:39.287261] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:54.638 [2024-07-15 07:47:39.372793] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:54.898 [2024-07-15 07:47:39.437914] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:54.898 [2024-07-15 07:47:39.479735] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:54.898 [2024-07-15 07:47:39.479758] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.467 07:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:55.467 07:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:55.467 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:55.727 [2024-07-15 07:47:40.286890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:55.727 [2024-07-15 07:47:40.286923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:55.727 [2024-07-15 07:47:40.286929] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:55.727 [2024-07-15 07:47:40.286935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.727 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:11:55.986 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.986 "name": "Existed_Raid", 00:11:55.986 "uuid": "2a3a53c4-303a-4ff8-a424-658fb19aba5b", 00:11:55.986 "strip_size_kb": 0, 00:11:55.986 "state": "configuring", 00:11:55.986 "raid_level": "raid1", 00:11:55.986 "superblock": true, 00:11:55.986 "num_base_bdevs": 2, 00:11:55.986 "num_base_bdevs_discovered": 0, 00:11:55.986 "num_base_bdevs_operational": 2, 00:11:55.986 "base_bdevs_list": [ 00:11:55.986 { 00:11:55.986 "name": "BaseBdev1", 00:11:55.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.986 "is_configured": false, 00:11:55.986 "data_offset": 0, 00:11:55.986 "data_size": 0 00:11:55.986 }, 00:11:55.986 { 00:11:55.986 "name": "BaseBdev2", 00:11:55.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.986 "is_configured": false, 00:11:55.986 "data_offset": 0, 00:11:55.986 "data_size": 0 00:11:55.986 } 00:11:55.986 ] 00:11:55.986 }' 00:11:55.986 07:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.986 07:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.555 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:56.555 [2024-07-15 07:47:41.229164] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:56.555 [2024-07-15 07:47:41.229189] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x140d6b0 name Existed_Raid, state configuring 00:11:56.555 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:56.814 [2024-07-15 07:47:41.421667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:56.814 [2024-07-15 07:47:41.421688] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:56.814 [2024-07-15 07:47:41.421693] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:56.814 [2024-07-15 07:47:41.421698] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:56.814 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:57.073 [2024-07-15 07:47:41.608627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:57.073 BaseBdev1 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:57.073 07:47:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:57.073 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:57.367 [ 00:11:57.367 { 00:11:57.367 "name": "BaseBdev1", 00:11:57.367 "aliases": [ 00:11:57.367 "1c54f100-4160-47a7-b115-a65ef33bbabd" 00:11:57.367 ], 00:11:57.367 "product_name": "Malloc disk", 00:11:57.367 "block_size": 512, 00:11:57.367 "num_blocks": 65536, 00:11:57.367 "uuid": "1c54f100-4160-47a7-b115-a65ef33bbabd", 00:11:57.367 "assigned_rate_limits": { 00:11:57.367 "rw_ios_per_sec": 0, 00:11:57.367 "rw_mbytes_per_sec": 0, 00:11:57.367 "r_mbytes_per_sec": 0, 00:11:57.367 "w_mbytes_per_sec": 0 00:11:57.367 }, 00:11:57.367 "claimed": true, 00:11:57.367 "claim_type": "exclusive_write", 00:11:57.367 "zoned": false, 00:11:57.367 "supported_io_types": { 00:11:57.367 "read": true, 00:11:57.367 "write": true, 00:11:57.367 "unmap": true, 00:11:57.367 "flush": true, 00:11:57.367 "reset": true, 00:11:57.367 "nvme_admin": false, 00:11:57.367 "nvme_io": false, 00:11:57.367 "nvme_io_md": false, 00:11:57.367 "write_zeroes": true, 00:11:57.367 "zcopy": true, 00:11:57.367 "get_zone_info": false, 00:11:57.367 "zone_management": false, 00:11:57.367 "zone_append": false, 00:11:57.367 "compare": false, 00:11:57.367 "compare_and_write": false, 00:11:57.367 "abort": true, 00:11:57.367 "seek_hole": false, 00:11:57.367 "seek_data": false, 00:11:57.367 "copy": true, 00:11:57.367 "nvme_iov_md": false 00:11:57.367 }, 00:11:57.367 "memory_domains": [ 00:11:57.367 { 00:11:57.367 "dma_device_id": "system", 00:11:57.367 "dma_device_type": 1 00:11:57.367 }, 00:11:57.367 { 00:11:57.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.367 "dma_device_type": 2 00:11:57.367 } 00:11:57.367 ], 00:11:57.367 "driver_specific": {} 00:11:57.367 } 00:11:57.367 ] 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:11:57.367 07:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.626 07:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.626 "name": "Existed_Raid", 00:11:57.626 "uuid": "cb68bf0b-d8e0-4b76-8470-40570cfa0629", 00:11:57.626 "strip_size_kb": 0, 00:11:57.626 "state": "configuring", 00:11:57.626 "raid_level": "raid1", 00:11:57.626 "superblock": true, 00:11:57.626 "num_base_bdevs": 2, 00:11:57.626 "num_base_bdevs_discovered": 1, 00:11:57.626 "num_base_bdevs_operational": 2, 00:11:57.626 "base_bdevs_list": [ 00:11:57.626 { 00:11:57.626 "name": "BaseBdev1", 00:11:57.626 "uuid": "1c54f100-4160-47a7-b115-a65ef33bbabd", 00:11:57.626 "is_configured": true, 00:11:57.626 "data_offset": 2048, 00:11:57.626 "data_size": 63488 00:11:57.626 }, 00:11:57.626 { 00:11:57.626 "name": "BaseBdev2", 00:11:57.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.626 "is_configured": false, 00:11:57.626 "data_offset": 0, 00:11:57.626 "data_size": 0 00:11:57.626 } 00:11:57.626 ] 00:11:57.626 }' 00:11:57.626 07:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.626 07:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:58.193 07:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:58.193 [2024-07-15 07:47:42.891885] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:58.193 [2024-07-15 07:47:42.891915] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x140cfa0 name Existed_Raid, state configuring 00:11:58.193 07:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:58.452 [2024-07-15 07:47:43.080390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:58.452 [2024-07-15 07:47:43.081569] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:58.452 [2024-07-15 07:47:43.081592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.452 
07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.452 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.710 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.710 "name": "Existed_Raid", 00:11:58.710 "uuid": "91839be2-f63f-4384-a2c1-b084ec12d9e9", 00:11:58.710 "strip_size_kb": 0, 00:11:58.710 "state": "configuring", 00:11:58.710 "raid_level": "raid1", 00:11:58.710 "superblock": true, 00:11:58.710 "num_base_bdevs": 2, 00:11:58.710 "num_base_bdevs_discovered": 1, 00:11:58.710 "num_base_bdevs_operational": 2, 00:11:58.710 "base_bdevs_list": [ 00:11:58.710 { 00:11:58.710 "name": "BaseBdev1", 00:11:58.710 "uuid": "1c54f100-4160-47a7-b115-a65ef33bbabd", 00:11:58.710 "is_configured": true, 00:11:58.710 "data_offset": 2048, 00:11:58.710 "data_size": 63488 00:11:58.710 }, 00:11:58.710 { 00:11:58.710 "name": "BaseBdev2", 00:11:58.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.710 "is_configured": false, 00:11:58.710 "data_offset": 0, 00:11:58.710 "data_size": 0 00:11:58.710 } 00:11:58.710 ] 00:11:58.710 }' 00:11:58.710 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.710 07:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:59.279 07:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:59.279 [2024-07-15 07:47:44.015696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:59.279 [2024-07-15 07:47:44.015813] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x140dd90 00:11:59.279 [2024-07-15 07:47:44.015821] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:59.279 [2024-07-15 07:47:44.015961] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15c18d0 00:11:59.279 [2024-07-15 07:47:44.016052] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x140dd90 00:11:59.279 [2024-07-15 07:47:44.016058] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x140dd90 00:11:59.279 [2024-07-15 07:47:44.016123] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:59.279 BaseBdev2 00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
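At this point the second base device has just been created with bdev_malloc_create and Existed_Raid moves from "configuring" to "online": the raid was registered first with -s (create with an on-disk superblock) while its base bdevs did not yet exist, and each malloc bdev created afterwards is claimed by the raid module until the set is complete. Reduced to the RPCs actually issued in the trace (rpc.py stands for /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py; JSON output omitted), the assembly is roughly:

  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  # neither base bdev exists yet, so the raid stays in the "configuring" state
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
  # with both bases claimed the raid is configured and reports "online"
  rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all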
00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:59.279 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:59.848 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:00.108 [ 00:12:00.108 { 00:12:00.108 "name": "BaseBdev2", 00:12:00.108 "aliases": [ 00:12:00.108 "f6e37890-b6c2-48bc-b477-93303f416f04" 00:12:00.108 ], 00:12:00.108 "product_name": "Malloc disk", 00:12:00.108 "block_size": 512, 00:12:00.108 "num_blocks": 65536, 00:12:00.108 "uuid": "f6e37890-b6c2-48bc-b477-93303f416f04", 00:12:00.108 "assigned_rate_limits": { 00:12:00.108 "rw_ios_per_sec": 0, 00:12:00.108 "rw_mbytes_per_sec": 0, 00:12:00.108 "r_mbytes_per_sec": 0, 00:12:00.108 "w_mbytes_per_sec": 0 00:12:00.108 }, 00:12:00.108 "claimed": true, 00:12:00.108 "claim_type": "exclusive_write", 00:12:00.108 "zoned": false, 00:12:00.108 "supported_io_types": { 00:12:00.108 "read": true, 00:12:00.108 "write": true, 00:12:00.108 "unmap": true, 00:12:00.108 "flush": true, 00:12:00.108 "reset": true, 00:12:00.108 "nvme_admin": false, 00:12:00.108 "nvme_io": false, 00:12:00.108 "nvme_io_md": false, 00:12:00.108 "write_zeroes": true, 00:12:00.108 "zcopy": true, 00:12:00.108 "get_zone_info": false, 00:12:00.108 "zone_management": false, 00:12:00.108 "zone_append": false, 00:12:00.108 "compare": false, 00:12:00.108 "compare_and_write": false, 00:12:00.108 "abort": true, 00:12:00.108 "seek_hole": false, 00:12:00.108 "seek_data": false, 00:12:00.108 "copy": true, 00:12:00.108 "nvme_iov_md": false 00:12:00.108 }, 00:12:00.108 "memory_domains": [ 00:12:00.108 { 00:12:00.108 "dma_device_id": "system", 00:12:00.108 "dma_device_type": 1 00:12:00.108 }, 00:12:00.108 { 00:12:00.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.108 "dma_device_type": 2 00:12:00.108 } 00:12:00.108 ], 00:12:00.108 "driver_specific": {} 00:12:00.108 } 00:12:00.108 ] 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.108 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.368 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.368 "name": "Existed_Raid", 00:12:00.368 "uuid": "91839be2-f63f-4384-a2c1-b084ec12d9e9", 00:12:00.368 "strip_size_kb": 0, 00:12:00.368 "state": "online", 00:12:00.368 "raid_level": "raid1", 00:12:00.368 "superblock": true, 00:12:00.368 "num_base_bdevs": 2, 00:12:00.368 "num_base_bdevs_discovered": 2, 00:12:00.368 "num_base_bdevs_operational": 2, 00:12:00.368 "base_bdevs_list": [ 00:12:00.368 { 00:12:00.368 "name": "BaseBdev1", 00:12:00.368 "uuid": "1c54f100-4160-47a7-b115-a65ef33bbabd", 00:12:00.368 "is_configured": true, 00:12:00.368 "data_offset": 2048, 00:12:00.368 "data_size": 63488 00:12:00.368 }, 00:12:00.368 { 00:12:00.368 "name": "BaseBdev2", 00:12:00.368 "uuid": "f6e37890-b6c2-48bc-b477-93303f416f04", 00:12:00.368 "is_configured": true, 00:12:00.368 "data_offset": 2048, 00:12:00.368 "data_size": 63488 00:12:00.368 } 00:12:00.368 ] 00:12:00.368 }' 00:12:00.368 07:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.368 07:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:00.938 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:01.199 [2024-07-15 07:47:45.700176] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:01.199 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:01.199 "name": "Existed_Raid", 00:12:01.199 "aliases": [ 00:12:01.199 "91839be2-f63f-4384-a2c1-b084ec12d9e9" 00:12:01.199 ], 00:12:01.199 "product_name": "Raid Volume", 00:12:01.199 "block_size": 512, 00:12:01.199 "num_blocks": 63488, 00:12:01.199 "uuid": "91839be2-f63f-4384-a2c1-b084ec12d9e9", 00:12:01.199 "assigned_rate_limits": { 00:12:01.199 "rw_ios_per_sec": 0, 00:12:01.199 "rw_mbytes_per_sec": 0, 00:12:01.199 "r_mbytes_per_sec": 0, 00:12:01.199 "w_mbytes_per_sec": 0 00:12:01.199 }, 00:12:01.199 "claimed": false, 00:12:01.199 "zoned": false, 00:12:01.199 "supported_io_types": { 00:12:01.199 "read": true, 
00:12:01.199 "write": true, 00:12:01.199 "unmap": false, 00:12:01.199 "flush": false, 00:12:01.199 "reset": true, 00:12:01.199 "nvme_admin": false, 00:12:01.199 "nvme_io": false, 00:12:01.199 "nvme_io_md": false, 00:12:01.199 "write_zeroes": true, 00:12:01.199 "zcopy": false, 00:12:01.199 "get_zone_info": false, 00:12:01.199 "zone_management": false, 00:12:01.199 "zone_append": false, 00:12:01.199 "compare": false, 00:12:01.199 "compare_and_write": false, 00:12:01.199 "abort": false, 00:12:01.199 "seek_hole": false, 00:12:01.199 "seek_data": false, 00:12:01.199 "copy": false, 00:12:01.199 "nvme_iov_md": false 00:12:01.199 }, 00:12:01.199 "memory_domains": [ 00:12:01.199 { 00:12:01.199 "dma_device_id": "system", 00:12:01.199 "dma_device_type": 1 00:12:01.199 }, 00:12:01.199 { 00:12:01.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.199 "dma_device_type": 2 00:12:01.199 }, 00:12:01.199 { 00:12:01.199 "dma_device_id": "system", 00:12:01.199 "dma_device_type": 1 00:12:01.199 }, 00:12:01.199 { 00:12:01.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.199 "dma_device_type": 2 00:12:01.199 } 00:12:01.199 ], 00:12:01.199 "driver_specific": { 00:12:01.199 "raid": { 00:12:01.199 "uuid": "91839be2-f63f-4384-a2c1-b084ec12d9e9", 00:12:01.199 "strip_size_kb": 0, 00:12:01.199 "state": "online", 00:12:01.199 "raid_level": "raid1", 00:12:01.199 "superblock": true, 00:12:01.199 "num_base_bdevs": 2, 00:12:01.199 "num_base_bdevs_discovered": 2, 00:12:01.199 "num_base_bdevs_operational": 2, 00:12:01.199 "base_bdevs_list": [ 00:12:01.199 { 00:12:01.199 "name": "BaseBdev1", 00:12:01.199 "uuid": "1c54f100-4160-47a7-b115-a65ef33bbabd", 00:12:01.199 "is_configured": true, 00:12:01.199 "data_offset": 2048, 00:12:01.199 "data_size": 63488 00:12:01.199 }, 00:12:01.199 { 00:12:01.199 "name": "BaseBdev2", 00:12:01.199 "uuid": "f6e37890-b6c2-48bc-b477-93303f416f04", 00:12:01.199 "is_configured": true, 00:12:01.199 "data_offset": 2048, 00:12:01.199 "data_size": 63488 00:12:01.199 } 00:12:01.199 ] 00:12:01.199 } 00:12:01.199 } 00:12:01.199 }' 00:12:01.199 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:01.199 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:01.199 BaseBdev2' 00:12:01.199 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:01.199 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:01.199 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:01.458 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:01.458 "name": "BaseBdev1", 00:12:01.458 "aliases": [ 00:12:01.458 "1c54f100-4160-47a7-b115-a65ef33bbabd" 00:12:01.458 ], 00:12:01.458 "product_name": "Malloc disk", 00:12:01.458 "block_size": 512, 00:12:01.458 "num_blocks": 65536, 00:12:01.458 "uuid": "1c54f100-4160-47a7-b115-a65ef33bbabd", 00:12:01.458 "assigned_rate_limits": { 00:12:01.458 "rw_ios_per_sec": 0, 00:12:01.458 "rw_mbytes_per_sec": 0, 00:12:01.458 "r_mbytes_per_sec": 0, 00:12:01.458 "w_mbytes_per_sec": 0 00:12:01.458 }, 00:12:01.458 "claimed": true, 00:12:01.458 "claim_type": "exclusive_write", 00:12:01.458 "zoned": false, 00:12:01.458 "supported_io_types": { 
00:12:01.458 "read": true, 00:12:01.459 "write": true, 00:12:01.459 "unmap": true, 00:12:01.459 "flush": true, 00:12:01.459 "reset": true, 00:12:01.459 "nvme_admin": false, 00:12:01.459 "nvme_io": false, 00:12:01.459 "nvme_io_md": false, 00:12:01.459 "write_zeroes": true, 00:12:01.459 "zcopy": true, 00:12:01.459 "get_zone_info": false, 00:12:01.459 "zone_management": false, 00:12:01.459 "zone_append": false, 00:12:01.459 "compare": false, 00:12:01.459 "compare_and_write": false, 00:12:01.459 "abort": true, 00:12:01.459 "seek_hole": false, 00:12:01.459 "seek_data": false, 00:12:01.459 "copy": true, 00:12:01.459 "nvme_iov_md": false 00:12:01.459 }, 00:12:01.459 "memory_domains": [ 00:12:01.459 { 00:12:01.459 "dma_device_id": "system", 00:12:01.459 "dma_device_type": 1 00:12:01.459 }, 00:12:01.459 { 00:12:01.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.459 "dma_device_type": 2 00:12:01.459 } 00:12:01.459 ], 00:12:01.459 "driver_specific": {} 00:12:01.459 }' 00:12:01.459 07:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:01.459 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:01.719 "name": "BaseBdev2", 00:12:01.719 "aliases": [ 00:12:01.719 "f6e37890-b6c2-48bc-b477-93303f416f04" 00:12:01.719 ], 00:12:01.719 "product_name": "Malloc disk", 00:12:01.719 "block_size": 512, 00:12:01.719 "num_blocks": 65536, 00:12:01.719 "uuid": "f6e37890-b6c2-48bc-b477-93303f416f04", 00:12:01.719 "assigned_rate_limits": { 00:12:01.719 "rw_ios_per_sec": 0, 00:12:01.719 "rw_mbytes_per_sec": 0, 00:12:01.719 "r_mbytes_per_sec": 0, 00:12:01.719 "w_mbytes_per_sec": 0 00:12:01.719 }, 00:12:01.719 "claimed": true, 00:12:01.719 "claim_type": "exclusive_write", 00:12:01.719 "zoned": false, 00:12:01.719 "supported_io_types": { 00:12:01.719 "read": true, 00:12:01.719 "write": true, 00:12:01.719 "unmap": true, 00:12:01.719 "flush": true, 00:12:01.719 "reset": 
true, 00:12:01.719 "nvme_admin": false, 00:12:01.719 "nvme_io": false, 00:12:01.719 "nvme_io_md": false, 00:12:01.719 "write_zeroes": true, 00:12:01.719 "zcopy": true, 00:12:01.719 "get_zone_info": false, 00:12:01.719 "zone_management": false, 00:12:01.719 "zone_append": false, 00:12:01.719 "compare": false, 00:12:01.719 "compare_and_write": false, 00:12:01.719 "abort": true, 00:12:01.719 "seek_hole": false, 00:12:01.719 "seek_data": false, 00:12:01.719 "copy": true, 00:12:01.719 "nvme_iov_md": false 00:12:01.719 }, 00:12:01.719 "memory_domains": [ 00:12:01.719 { 00:12:01.719 "dma_device_id": "system", 00:12:01.719 "dma_device_type": 1 00:12:01.719 }, 00:12:01.719 { 00:12:01.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.719 "dma_device_type": 2 00:12:01.719 } 00:12:01.719 ], 00:12:01.719 "driver_specific": {} 00:12:01.719 }' 00:12:01.719 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:01.979 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:01.979 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:01.979 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:01.979 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:01.980 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:01.980 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:01.980 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:02.239 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:02.239 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:02.239 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:02.239 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:02.239 07:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:02.499 [2024-07-15 07:47:47.007313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.499 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.499 "name": "Existed_Raid", 00:12:02.499 "uuid": "91839be2-f63f-4384-a2c1-b084ec12d9e9", 00:12:02.499 "strip_size_kb": 0, 00:12:02.499 "state": "online", 00:12:02.499 "raid_level": "raid1", 00:12:02.499 "superblock": true, 00:12:02.499 "num_base_bdevs": 2, 00:12:02.499 "num_base_bdevs_discovered": 1, 00:12:02.499 "num_base_bdevs_operational": 1, 00:12:02.499 "base_bdevs_list": [ 00:12:02.499 { 00:12:02.499 "name": null, 00:12:02.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.499 "is_configured": false, 00:12:02.499 "data_offset": 2048, 00:12:02.499 "data_size": 63488 00:12:02.499 }, 00:12:02.499 { 00:12:02.499 "name": "BaseBdev2", 00:12:02.499 "uuid": "f6e37890-b6c2-48bc-b477-93303f416f04", 00:12:02.500 "is_configured": true, 00:12:02.500 "data_offset": 2048, 00:12:02.500 "data_size": 63488 00:12:02.500 } 00:12:02.500 ] 00:12:02.500 }' 00:12:02.500 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.500 07:47:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:03.071 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:03.071 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:03.071 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.071 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:03.330 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:03.330 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:03.331 07:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:03.591 [2024-07-15 07:47:48.150201] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:03.591 [2024-07-15 07:47:48.150268] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:03.591 [2024-07-15 07:47:48.156256] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:03.591 [2024-07-15 07:47:48.156283] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:03.591 [2024-07-15 07:47:48.156289] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x140dd90 name Existed_Raid, state offline 00:12:03.591 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:03.591 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:03.591 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.591 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1602175 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1602175 ']' 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1602175 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1602175 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1602175' 00:12:03.851 killing process with pid 1602175 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1602175 00:12:03.851 [2024-07-15 07:47:48.413571] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1602175 00:12:03.851 [2024-07-15 07:47:48.414161] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:03.851 00:12:03.851 real 0m9.311s 00:12:03.851 user 0m16.960s 00:12:03.851 sys 0m1.403s 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:03.851 07:47:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:03.851 ************************************ 00:12:03.851 END TEST raid_state_function_test_sb 00:12:03.851 ************************************ 00:12:03.851 07:47:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:03.851 07:47:48 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:03.852 07:47:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:03.852 07:47:48 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:12:03.852 07:47:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:04.111 ************************************ 00:12:04.111 START TEST raid_superblock_test 00:12:04.111 ************************************ 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:04.111 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1603940 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1603940 /var/tmp/spdk-raid.sock 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1603940 ']' 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:04.112 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:04.112 07:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.112 [2024-07-15 07:47:48.681233] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:12:04.112 [2024-07-15 07:47:48.681288] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603940 ] 00:12:04.112 [2024-07-15 07:47:48.774825] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:04.112 [2024-07-15 07:47:48.849140] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:04.372 [2024-07-15 07:47:48.891758] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.372 [2024-07-15 07:47:48.891781] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:04.944 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:04.944 malloc1 00:12:05.205 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:05.205 [2024-07-15 07:47:49.886617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:05.205 [2024-07-15 07:47:49.886652] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:05.205 [2024-07-15 07:47:49.886663] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148aa20 00:12:05.205 [2024-07-15 07:47:49.886669] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:05.206 [2024-07-15 07:47:49.888060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:05.206 [2024-07-15 07:47:49.888080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:05.206 pt1 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:05.206 07:47:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:05.466 malloc2 00:12:05.466 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:05.726 [2024-07-15 07:47:50.285563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:05.726 [2024-07-15 07:47:50.285595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:05.726 [2024-07-15 07:47:50.285607] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148b040 00:12:05.726 [2024-07-15 07:47:50.285614] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:05.726 [2024-07-15 07:47:50.286768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:05.726 [2024-07-15 07:47:50.286785] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:05.726 pt2 00:12:05.726 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:05.726 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:05.726 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:05.726 [2024-07-15 07:47:50.478053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:05.726 [2024-07-15 07:47:50.479016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:05.726 [2024-07-15 07:47:50.479121] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16373d0 00:12:05.726 [2024-07-15 07:47:50.479128] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:05.726 [2024-07-15 07:47:50.479267] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a1910 00:12:05.726 [2024-07-15 07:47:50.479374] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16373d0 00:12:05.726 [2024-07-15 07:47:50.479380] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16373d0 00:12:05.726 [2024-07-15 07:47:50.479446] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:05.986 07:47:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.986 "name": "raid_bdev1", 00:12:05.986 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:05.986 "strip_size_kb": 0, 00:12:05.986 "state": "online", 00:12:05.986 "raid_level": "raid1", 00:12:05.986 "superblock": true, 00:12:05.986 "num_base_bdevs": 2, 00:12:05.986 "num_base_bdevs_discovered": 2, 00:12:05.986 "num_base_bdevs_operational": 2, 00:12:05.986 "base_bdevs_list": [ 00:12:05.986 { 00:12:05.986 "name": "pt1", 00:12:05.986 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:05.986 "is_configured": true, 00:12:05.986 "data_offset": 2048, 00:12:05.986 "data_size": 63488 00:12:05.986 }, 00:12:05.986 { 00:12:05.986 "name": "pt2", 00:12:05.986 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:05.986 "is_configured": true, 00:12:05.986 "data_offset": 2048, 00:12:05.986 "data_size": 63488 00:12:05.986 } 00:12:05.986 ] 00:12:05.986 }' 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.986 07:47:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:06.555 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:06.815 [2024-07-15 07:47:51.404579] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:06.815 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:06.815 "name": "raid_bdev1", 00:12:06.815 "aliases": [ 00:12:06.815 "20bac51d-fb1a-44c4-a4e5-00f435c1686b" 00:12:06.815 ], 00:12:06.815 "product_name": "Raid Volume", 00:12:06.815 "block_size": 512, 00:12:06.815 "num_blocks": 63488, 00:12:06.815 "uuid": 
"20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:06.815 "assigned_rate_limits": { 00:12:06.815 "rw_ios_per_sec": 0, 00:12:06.815 "rw_mbytes_per_sec": 0, 00:12:06.815 "r_mbytes_per_sec": 0, 00:12:06.815 "w_mbytes_per_sec": 0 00:12:06.815 }, 00:12:06.815 "claimed": false, 00:12:06.815 "zoned": false, 00:12:06.815 "supported_io_types": { 00:12:06.815 "read": true, 00:12:06.815 "write": true, 00:12:06.815 "unmap": false, 00:12:06.815 "flush": false, 00:12:06.815 "reset": true, 00:12:06.815 "nvme_admin": false, 00:12:06.815 "nvme_io": false, 00:12:06.815 "nvme_io_md": false, 00:12:06.815 "write_zeroes": true, 00:12:06.815 "zcopy": false, 00:12:06.815 "get_zone_info": false, 00:12:06.815 "zone_management": false, 00:12:06.815 "zone_append": false, 00:12:06.815 "compare": false, 00:12:06.815 "compare_and_write": false, 00:12:06.815 "abort": false, 00:12:06.815 "seek_hole": false, 00:12:06.815 "seek_data": false, 00:12:06.815 "copy": false, 00:12:06.815 "nvme_iov_md": false 00:12:06.815 }, 00:12:06.815 "memory_domains": [ 00:12:06.815 { 00:12:06.815 "dma_device_id": "system", 00:12:06.815 "dma_device_type": 1 00:12:06.815 }, 00:12:06.815 { 00:12:06.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:06.815 "dma_device_type": 2 00:12:06.815 }, 00:12:06.815 { 00:12:06.815 "dma_device_id": "system", 00:12:06.815 "dma_device_type": 1 00:12:06.815 }, 00:12:06.815 { 00:12:06.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:06.815 "dma_device_type": 2 00:12:06.815 } 00:12:06.815 ], 00:12:06.815 "driver_specific": { 00:12:06.815 "raid": { 00:12:06.815 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:06.815 "strip_size_kb": 0, 00:12:06.815 "state": "online", 00:12:06.815 "raid_level": "raid1", 00:12:06.815 "superblock": true, 00:12:06.815 "num_base_bdevs": 2, 00:12:06.815 "num_base_bdevs_discovered": 2, 00:12:06.815 "num_base_bdevs_operational": 2, 00:12:06.815 "base_bdevs_list": [ 00:12:06.815 { 00:12:06.815 "name": "pt1", 00:12:06.815 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:06.815 "is_configured": true, 00:12:06.815 "data_offset": 2048, 00:12:06.815 "data_size": 63488 00:12:06.815 }, 00:12:06.815 { 00:12:06.815 "name": "pt2", 00:12:06.815 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:06.815 "is_configured": true, 00:12:06.815 "data_offset": 2048, 00:12:06.815 "data_size": 63488 00:12:06.815 } 00:12:06.815 ] 00:12:06.815 } 00:12:06.815 } 00:12:06.815 }' 00:12:06.815 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:06.815 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:06.815 pt2' 00:12:06.815 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:06.815 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:06.815 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:07.075 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:07.075 "name": "pt1", 00:12:07.075 "aliases": [ 00:12:07.075 "00000000-0000-0000-0000-000000000001" 00:12:07.075 ], 00:12:07.075 "product_name": "passthru", 00:12:07.075 "block_size": 512, 00:12:07.075 "num_blocks": 65536, 00:12:07.075 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:07.075 "assigned_rate_limits": { 00:12:07.075 
"rw_ios_per_sec": 0, 00:12:07.075 "rw_mbytes_per_sec": 0, 00:12:07.075 "r_mbytes_per_sec": 0, 00:12:07.075 "w_mbytes_per_sec": 0 00:12:07.075 }, 00:12:07.075 "claimed": true, 00:12:07.075 "claim_type": "exclusive_write", 00:12:07.075 "zoned": false, 00:12:07.075 "supported_io_types": { 00:12:07.075 "read": true, 00:12:07.075 "write": true, 00:12:07.075 "unmap": true, 00:12:07.075 "flush": true, 00:12:07.075 "reset": true, 00:12:07.075 "nvme_admin": false, 00:12:07.075 "nvme_io": false, 00:12:07.075 "nvme_io_md": false, 00:12:07.075 "write_zeroes": true, 00:12:07.075 "zcopy": true, 00:12:07.075 "get_zone_info": false, 00:12:07.075 "zone_management": false, 00:12:07.075 "zone_append": false, 00:12:07.075 "compare": false, 00:12:07.075 "compare_and_write": false, 00:12:07.075 "abort": true, 00:12:07.075 "seek_hole": false, 00:12:07.075 "seek_data": false, 00:12:07.075 "copy": true, 00:12:07.075 "nvme_iov_md": false 00:12:07.075 }, 00:12:07.075 "memory_domains": [ 00:12:07.075 { 00:12:07.075 "dma_device_id": "system", 00:12:07.075 "dma_device_type": 1 00:12:07.075 }, 00:12:07.075 { 00:12:07.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.075 "dma_device_type": 2 00:12:07.075 } 00:12:07.075 ], 00:12:07.075 "driver_specific": { 00:12:07.075 "passthru": { 00:12:07.075 "name": "pt1", 00:12:07.075 "base_bdev_name": "malloc1" 00:12:07.075 } 00:12:07.075 } 00:12:07.075 }' 00:12:07.075 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:07.075 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:07.075 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:07.075 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.076 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.076 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:07.076 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.335 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.335 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:07.335 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:07.335 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:07.335 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:07.335 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:07.336 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:07.336 07:47:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:07.595 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:07.595 "name": "pt2", 00:12:07.595 "aliases": [ 00:12:07.595 "00000000-0000-0000-0000-000000000002" 00:12:07.595 ], 00:12:07.595 "product_name": "passthru", 00:12:07.595 "block_size": 512, 00:12:07.595 "num_blocks": 65536, 00:12:07.595 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:07.595 "assigned_rate_limits": { 00:12:07.595 "rw_ios_per_sec": 0, 00:12:07.595 "rw_mbytes_per_sec": 0, 00:12:07.595 "r_mbytes_per_sec": 0, 00:12:07.595 "w_mbytes_per_sec": 0 
00:12:07.595 }, 00:12:07.595 "claimed": true, 00:12:07.595 "claim_type": "exclusive_write", 00:12:07.595 "zoned": false, 00:12:07.595 "supported_io_types": { 00:12:07.595 "read": true, 00:12:07.595 "write": true, 00:12:07.596 "unmap": true, 00:12:07.596 "flush": true, 00:12:07.596 "reset": true, 00:12:07.596 "nvme_admin": false, 00:12:07.596 "nvme_io": false, 00:12:07.596 "nvme_io_md": false, 00:12:07.596 "write_zeroes": true, 00:12:07.596 "zcopy": true, 00:12:07.596 "get_zone_info": false, 00:12:07.596 "zone_management": false, 00:12:07.596 "zone_append": false, 00:12:07.596 "compare": false, 00:12:07.596 "compare_and_write": false, 00:12:07.596 "abort": true, 00:12:07.596 "seek_hole": false, 00:12:07.596 "seek_data": false, 00:12:07.596 "copy": true, 00:12:07.596 "nvme_iov_md": false 00:12:07.596 }, 00:12:07.596 "memory_domains": [ 00:12:07.596 { 00:12:07.596 "dma_device_id": "system", 00:12:07.596 "dma_device_type": 1 00:12:07.596 }, 00:12:07.596 { 00:12:07.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.596 "dma_device_type": 2 00:12:07.596 } 00:12:07.596 ], 00:12:07.596 "driver_specific": { 00:12:07.596 "passthru": { 00:12:07.596 "name": "pt2", 00:12:07.596 "base_bdev_name": "malloc2" 00:12:07.596 } 00:12:07.596 } 00:12:07.596 }' 00:12:07.596 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:07.596 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:07.596 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:07.596 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.596 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:07.856 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:08.116 [2024-07-15 07:47:52.707876] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:08.116 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=20bac51d-fb1a-44c4-a4e5-00f435c1686b 00:12:08.116 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 20bac51d-fb1a-44c4-a4e5-00f435c1686b ']' 00:12:08.116 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:08.377 [2024-07-15 07:47:52.900161] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:08.377 [2024-07-15 07:47:52.900172] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:08.377 [2024-07-15 07:47:52.900208] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:08.377 [2024-07-15 07:47:52.900249] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:08.377 [2024-07-15 07:47:52.900255] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16373d0 name raid_bdev1, state offline 00:12:08.377 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.377 07:47:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:08.377 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:08.377 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:08.377 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:08.377 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:08.636 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:08.637 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:08.897 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:09.157 [2024-07-15 07:47:53.826470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:09.157 [2024-07-15 07:47:53.827533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:09.157 [2024-07-15 07:47:53.827575] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:09.157 [2024-07-15 07:47:53.827602] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:09.157 [2024-07-15 07:47:53.827613] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:09.157 [2024-07-15 07:47:53.827619] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1635960 name raid_bdev1, state configuring 00:12:09.157 request: 00:12:09.157 { 00:12:09.157 "name": "raid_bdev1", 00:12:09.157 "raid_level": "raid1", 00:12:09.157 "base_bdevs": [ 00:12:09.157 "malloc1", 00:12:09.157 "malloc2" 00:12:09.157 ], 00:12:09.158 "superblock": false, 00:12:09.158 "method": "bdev_raid_create", 00:12:09.158 "req_id": 1 00:12:09.158 } 00:12:09.158 Got JSON-RPC error response 00:12:09.158 response: 00:12:09.158 { 00:12:09.158 "code": -17, 00:12:09.158 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:09.158 } 00:12:09.158 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:09.158 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:09.158 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:09.158 07:47:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:09.158 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.158 07:47:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:09.418 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:09.418 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:09.418 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:09.700 [2024-07-15 07:47:54.207390] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:09.700 [2024-07-15 07:47:54.207413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.700 [2024-07-15 07:47:54.207426] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1636ba0 00:12:09.700 [2024-07-15 07:47:54.207432] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.700 [2024-07-15 07:47:54.208695] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:12:09.700 [2024-07-15 07:47:54.208722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:09.700 [2024-07-15 07:47:54.208767] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:09.700 [2024-07-15 07:47:54.208785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:09.700 pt1 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.700 "name": "raid_bdev1", 00:12:09.700 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:09.700 "strip_size_kb": 0, 00:12:09.700 "state": "configuring", 00:12:09.700 "raid_level": "raid1", 00:12:09.700 "superblock": true, 00:12:09.700 "num_base_bdevs": 2, 00:12:09.700 "num_base_bdevs_discovered": 1, 00:12:09.700 "num_base_bdevs_operational": 2, 00:12:09.700 "base_bdevs_list": [ 00:12:09.700 { 00:12:09.700 "name": "pt1", 00:12:09.700 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:09.700 "is_configured": true, 00:12:09.700 "data_offset": 2048, 00:12:09.700 "data_size": 63488 00:12:09.700 }, 00:12:09.700 { 00:12:09.700 "name": null, 00:12:09.700 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:09.700 "is_configured": false, 00:12:09.700 "data_offset": 2048, 00:12:09.700 "data_size": 63488 00:12:09.700 } 00:12:09.700 ] 00:12:09.700 }' 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.700 07:47:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.290 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:10.290 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:10.290 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:10.290 07:47:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:12:10.550 [2024-07-15 07:47:55.109673] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:10.550 [2024-07-15 07:47:55.109704] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:10.550 [2024-07-15 07:47:55.109723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148be00 00:12:10.550 [2024-07-15 07:47:55.109730] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:10.550 [2024-07-15 07:47:55.109995] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:10.550 [2024-07-15 07:47:55.110006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:10.550 [2024-07-15 07:47:55.110045] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:10.550 [2024-07-15 07:47:55.110057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:10.550 [2024-07-15 07:47:55.110133] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1489820 00:12:10.550 [2024-07-15 07:47:55.110139] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:10.550 [2024-07-15 07:47:55.110281] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1638e90 00:12:10.550 [2024-07-15 07:47:55.110379] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1489820 00:12:10.551 [2024-07-15 07:47:55.110384] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1489820 00:12:10.551 [2024-07-15 07:47:55.110457] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:10.551 pt2 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.551 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:10.811 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.811 "name": "raid_bdev1", 00:12:10.811 
"uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:10.811 "strip_size_kb": 0, 00:12:10.811 "state": "online", 00:12:10.811 "raid_level": "raid1", 00:12:10.811 "superblock": true, 00:12:10.811 "num_base_bdevs": 2, 00:12:10.811 "num_base_bdevs_discovered": 2, 00:12:10.811 "num_base_bdevs_operational": 2, 00:12:10.811 "base_bdevs_list": [ 00:12:10.811 { 00:12:10.811 "name": "pt1", 00:12:10.811 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:10.811 "is_configured": true, 00:12:10.811 "data_offset": 2048, 00:12:10.811 "data_size": 63488 00:12:10.811 }, 00:12:10.811 { 00:12:10.811 "name": "pt2", 00:12:10.811 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:10.811 "is_configured": true, 00:12:10.811 "data_offset": 2048, 00:12:10.811 "data_size": 63488 00:12:10.811 } 00:12:10.811 ] 00:12:10.811 }' 00:12:10.811 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.811 07:47:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:11.380 07:47:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:11.380 [2024-07-15 07:47:56.040234] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:11.380 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:11.380 "name": "raid_bdev1", 00:12:11.380 "aliases": [ 00:12:11.380 "20bac51d-fb1a-44c4-a4e5-00f435c1686b" 00:12:11.380 ], 00:12:11.380 "product_name": "Raid Volume", 00:12:11.380 "block_size": 512, 00:12:11.380 "num_blocks": 63488, 00:12:11.380 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:11.380 "assigned_rate_limits": { 00:12:11.380 "rw_ios_per_sec": 0, 00:12:11.380 "rw_mbytes_per_sec": 0, 00:12:11.380 "r_mbytes_per_sec": 0, 00:12:11.380 "w_mbytes_per_sec": 0 00:12:11.380 }, 00:12:11.380 "claimed": false, 00:12:11.380 "zoned": false, 00:12:11.380 "supported_io_types": { 00:12:11.380 "read": true, 00:12:11.380 "write": true, 00:12:11.380 "unmap": false, 00:12:11.380 "flush": false, 00:12:11.380 "reset": true, 00:12:11.380 "nvme_admin": false, 00:12:11.380 "nvme_io": false, 00:12:11.380 "nvme_io_md": false, 00:12:11.380 "write_zeroes": true, 00:12:11.380 "zcopy": false, 00:12:11.380 "get_zone_info": false, 00:12:11.380 "zone_management": false, 00:12:11.380 "zone_append": false, 00:12:11.380 "compare": false, 00:12:11.380 "compare_and_write": false, 00:12:11.380 "abort": false, 00:12:11.380 "seek_hole": false, 00:12:11.380 "seek_data": false, 00:12:11.380 "copy": false, 00:12:11.380 "nvme_iov_md": false 00:12:11.380 }, 00:12:11.380 "memory_domains": [ 00:12:11.380 { 00:12:11.380 "dma_device_id": "system", 00:12:11.380 "dma_device_type": 1 
00:12:11.380 }, 00:12:11.380 { 00:12:11.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.380 "dma_device_type": 2 00:12:11.380 }, 00:12:11.380 { 00:12:11.380 "dma_device_id": "system", 00:12:11.380 "dma_device_type": 1 00:12:11.380 }, 00:12:11.380 { 00:12:11.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.380 "dma_device_type": 2 00:12:11.380 } 00:12:11.380 ], 00:12:11.381 "driver_specific": { 00:12:11.381 "raid": { 00:12:11.381 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:11.381 "strip_size_kb": 0, 00:12:11.381 "state": "online", 00:12:11.381 "raid_level": "raid1", 00:12:11.381 "superblock": true, 00:12:11.381 "num_base_bdevs": 2, 00:12:11.381 "num_base_bdevs_discovered": 2, 00:12:11.381 "num_base_bdevs_operational": 2, 00:12:11.381 "base_bdevs_list": [ 00:12:11.381 { 00:12:11.381 "name": "pt1", 00:12:11.381 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:11.381 "is_configured": true, 00:12:11.381 "data_offset": 2048, 00:12:11.381 "data_size": 63488 00:12:11.381 }, 00:12:11.381 { 00:12:11.381 "name": "pt2", 00:12:11.381 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:11.381 "is_configured": true, 00:12:11.381 "data_offset": 2048, 00:12:11.381 "data_size": 63488 00:12:11.381 } 00:12:11.381 ] 00:12:11.381 } 00:12:11.381 } 00:12:11.381 }' 00:12:11.381 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:11.381 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:11.381 pt2' 00:12:11.381 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:11.381 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:11.381 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.640 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.640 "name": "pt1", 00:12:11.640 "aliases": [ 00:12:11.640 "00000000-0000-0000-0000-000000000001" 00:12:11.640 ], 00:12:11.640 "product_name": "passthru", 00:12:11.640 "block_size": 512, 00:12:11.640 "num_blocks": 65536, 00:12:11.640 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:11.640 "assigned_rate_limits": { 00:12:11.640 "rw_ios_per_sec": 0, 00:12:11.640 "rw_mbytes_per_sec": 0, 00:12:11.640 "r_mbytes_per_sec": 0, 00:12:11.640 "w_mbytes_per_sec": 0 00:12:11.640 }, 00:12:11.640 "claimed": true, 00:12:11.640 "claim_type": "exclusive_write", 00:12:11.640 "zoned": false, 00:12:11.640 "supported_io_types": { 00:12:11.640 "read": true, 00:12:11.640 "write": true, 00:12:11.640 "unmap": true, 00:12:11.640 "flush": true, 00:12:11.640 "reset": true, 00:12:11.640 "nvme_admin": false, 00:12:11.640 "nvme_io": false, 00:12:11.640 "nvme_io_md": false, 00:12:11.640 "write_zeroes": true, 00:12:11.640 "zcopy": true, 00:12:11.640 "get_zone_info": false, 00:12:11.640 "zone_management": false, 00:12:11.640 "zone_append": false, 00:12:11.640 "compare": false, 00:12:11.640 "compare_and_write": false, 00:12:11.640 "abort": true, 00:12:11.640 "seek_hole": false, 00:12:11.640 "seek_data": false, 00:12:11.640 "copy": true, 00:12:11.640 "nvme_iov_md": false 00:12:11.640 }, 00:12:11.640 "memory_domains": [ 00:12:11.640 { 00:12:11.640 "dma_device_id": "system", 00:12:11.640 "dma_device_type": 1 00:12:11.640 }, 00:12:11.640 { 00:12:11.640 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:11.640 "dma_device_type": 2 00:12:11.640 } 00:12:11.640 ], 00:12:11.640 "driver_specific": { 00:12:11.640 "passthru": { 00:12:11.640 "name": "pt1", 00:12:11.640 "base_bdev_name": "malloc1" 00:12:11.640 } 00:12:11.640 } 00:12:11.640 }' 00:12:11.640 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.640 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.640 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.640 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:11.899 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:12.159 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:12.159 "name": "pt2", 00:12:12.159 "aliases": [ 00:12:12.159 "00000000-0000-0000-0000-000000000002" 00:12:12.159 ], 00:12:12.159 "product_name": "passthru", 00:12:12.159 "block_size": 512, 00:12:12.159 "num_blocks": 65536, 00:12:12.159 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:12.159 "assigned_rate_limits": { 00:12:12.159 "rw_ios_per_sec": 0, 00:12:12.159 "rw_mbytes_per_sec": 0, 00:12:12.159 "r_mbytes_per_sec": 0, 00:12:12.159 "w_mbytes_per_sec": 0 00:12:12.159 }, 00:12:12.159 "claimed": true, 00:12:12.159 "claim_type": "exclusive_write", 00:12:12.159 "zoned": false, 00:12:12.159 "supported_io_types": { 00:12:12.159 "read": true, 00:12:12.159 "write": true, 00:12:12.159 "unmap": true, 00:12:12.159 "flush": true, 00:12:12.159 "reset": true, 00:12:12.159 "nvme_admin": false, 00:12:12.159 "nvme_io": false, 00:12:12.159 "nvme_io_md": false, 00:12:12.159 "write_zeroes": true, 00:12:12.159 "zcopy": true, 00:12:12.159 "get_zone_info": false, 00:12:12.159 "zone_management": false, 00:12:12.159 "zone_append": false, 00:12:12.159 "compare": false, 00:12:12.159 "compare_and_write": false, 00:12:12.159 "abort": true, 00:12:12.159 "seek_hole": false, 00:12:12.159 "seek_data": false, 00:12:12.159 "copy": true, 00:12:12.159 "nvme_iov_md": false 00:12:12.159 }, 00:12:12.159 "memory_domains": [ 00:12:12.159 { 00:12:12.159 "dma_device_id": "system", 00:12:12.159 "dma_device_type": 1 00:12:12.159 }, 00:12:12.159 { 00:12:12.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.159 "dma_device_type": 2 00:12:12.159 } 00:12:12.159 ], 00:12:12.159 "driver_specific": { 
00:12:12.159 "passthru": { 00:12:12.159 "name": "pt2", 00:12:12.159 "base_bdev_name": "malloc2" 00:12:12.160 } 00:12:12.160 } 00:12:12.160 }' 00:12:12.160 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:12.160 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:12.419 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:12.419 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:12.419 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:12.419 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:12.419 07:47:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:12.419 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:12.419 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:12.419 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.419 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.419 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:12.680 [2024-07-15 07:47:57.319472] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 20bac51d-fb1a-44c4-a4e5-00f435c1686b '!=' 20bac51d-fb1a-44c4-a4e5-00f435c1686b ']' 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:12.680 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:12.940 [2024-07-15 07:47:57.515781] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.940 07:47:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.940 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:13.200 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.200 "name": "raid_bdev1", 00:12:13.200 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:13.200 "strip_size_kb": 0, 00:12:13.200 "state": "online", 00:12:13.200 "raid_level": "raid1", 00:12:13.200 "superblock": true, 00:12:13.200 "num_base_bdevs": 2, 00:12:13.200 "num_base_bdevs_discovered": 1, 00:12:13.200 "num_base_bdevs_operational": 1, 00:12:13.200 "base_bdevs_list": [ 00:12:13.200 { 00:12:13.200 "name": null, 00:12:13.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.200 "is_configured": false, 00:12:13.200 "data_offset": 2048, 00:12:13.200 "data_size": 63488 00:12:13.200 }, 00:12:13.200 { 00:12:13.200 "name": "pt2", 00:12:13.200 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:13.200 "is_configured": true, 00:12:13.200 "data_offset": 2048, 00:12:13.200 "data_size": 63488 00:12:13.200 } 00:12:13.200 ] 00:12:13.200 }' 00:12:13.200 07:47:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.200 07:47:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.770 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:13.770 [2024-07-15 07:47:58.446115] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:13.771 [2024-07-15 07:47:58.446131] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:13.771 [2024-07-15 07:47:58.446164] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:13.771 [2024-07-15 07:47:58.446190] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:13.771 [2024-07-15 07:47:58.446196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1489820 name raid_bdev1, state offline 00:12:13.771 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.771 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:14.031 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:14.031 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:14.031 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:14.031 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:14.032 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:14.032 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:14.032 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:14.032 07:47:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:14.032 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:14.032 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:14.032 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:14.293 [2024-07-15 07:47:58.927317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:14.293 [2024-07-15 07:47:58.927343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:14.293 [2024-07-15 07:47:58.927353] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1633d70 00:12:14.293 [2024-07-15 07:47:58.927360] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:14.293 [2024-07-15 07:47:58.928627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:14.293 [2024-07-15 07:47:58.928646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:14.293 [2024-07-15 07:47:58.928690] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:14.293 [2024-07-15 07:47:58.928708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:14.293 [2024-07-15 07:47:58.928779] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1638ae0 00:12:14.293 [2024-07-15 07:47:58.928785] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:14.293 [2024-07-15 07:47:58.928924] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1636160 00:12:14.293 [2024-07-15 07:47:58.929018] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1638ae0 00:12:14.293 [2024-07-15 07:47:58.929023] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1638ae0 00:12:14.293 [2024-07-15 07:47:58.929092] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:14.293 pt2 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:14.293 07:47:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.554 07:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.554 "name": "raid_bdev1", 00:12:14.554 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:14.554 "strip_size_kb": 0, 00:12:14.554 "state": "online", 00:12:14.554 "raid_level": "raid1", 00:12:14.554 "superblock": true, 00:12:14.554 "num_base_bdevs": 2, 00:12:14.554 "num_base_bdevs_discovered": 1, 00:12:14.554 "num_base_bdevs_operational": 1, 00:12:14.554 "base_bdevs_list": [ 00:12:14.554 { 00:12:14.554 "name": null, 00:12:14.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.554 "is_configured": false, 00:12:14.554 "data_offset": 2048, 00:12:14.554 "data_size": 63488 00:12:14.554 }, 00:12:14.554 { 00:12:14.554 "name": "pt2", 00:12:14.554 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.554 "is_configured": true, 00:12:14.554 "data_offset": 2048, 00:12:14.554 "data_size": 63488 00:12:14.554 } 00:12:14.554 ] 00:12:14.554 }' 00:12:14.554 07:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.554 07:47:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.125 07:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:15.125 [2024-07-15 07:47:59.853652] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:15.125 [2024-07-15 07:47:59.853669] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:15.125 [2024-07-15 07:47:59.853701] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:15.125 [2024-07-15 07:47:59.853733] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:15.125 [2024-07-15 07:47:59.853739] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1638ae0 name raid_bdev1, state offline 00:12:15.125 07:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.125 07:47:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:15.385 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:15.385 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:15.385 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:15.385 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:15.645 [2024-07-15 07:48:00.226874] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:15.645 [2024-07-15 07:48:00.226909] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:15.645 [2024-07-15 07:48:00.226920] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148b3e0 00:12:15.645 [2024-07-15 07:48:00.226927] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:15.645 [2024-07-15 07:48:00.228203] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:15.645 [2024-07-15 07:48:00.228222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:15.645 [2024-07-15 07:48:00.228268] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:15.645 [2024-07-15 07:48:00.228286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:15.645 [2024-07-15 07:48:00.228361] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:15.645 [2024-07-15 07:48:00.228368] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:15.645 [2024-07-15 07:48:00.228377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1638fa0 name raid_bdev1, state configuring 00:12:15.645 [2024-07-15 07:48:00.228391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:15.645 [2024-07-15 07:48:00.228430] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1638fa0 00:12:15.645 [2024-07-15 07:48:00.228441] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:15.645 [2024-07-15 07:48:00.228583] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1635850 00:12:15.645 [2024-07-15 07:48:00.228677] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1638fa0 00:12:15.645 [2024-07-15 07:48:00.228682] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1638fa0 00:12:15.645 [2024-07-15 07:48:00.228764] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.645 pt1 00:12:15.645 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.646 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:15.907 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.907 "name": "raid_bdev1", 00:12:15.907 "uuid": "20bac51d-fb1a-44c4-a4e5-00f435c1686b", 00:12:15.907 "strip_size_kb": 0, 00:12:15.907 "state": "online", 
00:12:15.907 "raid_level": "raid1", 00:12:15.907 "superblock": true, 00:12:15.907 "num_base_bdevs": 2, 00:12:15.907 "num_base_bdevs_discovered": 1, 00:12:15.907 "num_base_bdevs_operational": 1, 00:12:15.907 "base_bdevs_list": [ 00:12:15.907 { 00:12:15.908 "name": null, 00:12:15.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.908 "is_configured": false, 00:12:15.908 "data_offset": 2048, 00:12:15.908 "data_size": 63488 00:12:15.908 }, 00:12:15.908 { 00:12:15.908 "name": "pt2", 00:12:15.908 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:15.908 "is_configured": true, 00:12:15.908 "data_offset": 2048, 00:12:15.908 "data_size": 63488 00:12:15.908 } 00:12:15.908 ] 00:12:15.908 }' 00:12:15.908 07:48:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.908 07:48:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.478 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:16.478 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:16.478 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:16.739 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:16.739 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:16.998 [2024-07-15 07:48:01.742883] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 20bac51d-fb1a-44c4-a4e5-00f435c1686b '!=' 20bac51d-fb1a-44c4-a4e5-00f435c1686b ']' 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1603940 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1603940 ']' 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1603940 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1603940 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1603940' 00:12:17.260 killing process with pid 1603940 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1603940 00:12:17.260 [2024-07-15 07:48:01.829568] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:17.260 [2024-07-15 07:48:01.829606] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.260 [2024-07-15 07:48:01.829636] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:17.260 [2024-07-15 07:48:01.829642] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1638fa0 name raid_bdev1, state offline 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1603940 00:12:17.260 [2024-07-15 07:48:01.838964] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:17.260 00:12:17.260 real 0m13.349s 00:12:17.260 user 0m24.768s 00:12:17.260 sys 0m1.999s 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:17.260 07:48:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.260 ************************************ 00:12:17.260 END TEST raid_superblock_test 00:12:17.260 ************************************ 00:12:17.260 07:48:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:17.260 07:48:02 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:17.260 07:48:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:17.260 07:48:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:17.260 07:48:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:17.521 ************************************ 00:12:17.521 START TEST raid_read_error_test 00:12:17.521 ************************************ 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZLO0xCUUY0 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1606535 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1606535 /var/tmp/spdk-raid.sock 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1606535 ']' 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:17.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:17.521 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.521 [2024-07-15 07:48:02.108413] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:12:17.521 [2024-07-15 07:48:02.108471] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606535 ] 00:12:17.522 [2024-07-15 07:48:02.198256] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:17.522 [2024-07-15 07:48:02.273920] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:17.783 [2024-07-15 07:48:02.320836] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:17.783 [2024-07-15 07:48:02.320859] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.353 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:18.353 07:48:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:18.353 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:18.353 07:48:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:18.613 BaseBdev1_malloc 00:12:18.613 07:48:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:18.613 true 00:12:18.613 07:48:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:18.872 [2024-07-15 07:48:03.532941] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:18.872 [2024-07-15 07:48:03.532972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:18.872 [2024-07-15 07:48:03.532983] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x294ab50 00:12:18.872 [2024-07-15 07:48:03.532990] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:18.872 [2024-07-15 07:48:03.534311] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:18.872 [2024-07-15 07:48:03.534330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:18.872 BaseBdev1 00:12:18.872 07:48:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:18.872 07:48:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:19.132 BaseBdev2_malloc 00:12:19.132 07:48:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:19.392 true 00:12:19.392 07:48:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:19.392 [2024-07-15 07:48:04.092268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:19.392 [2024-07-15 07:48:04.092299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:19.392 [2024-07-15 07:48:04.092311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x292eea0 00:12:19.392 [2024-07-15 07:48:04.092317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:19.392 [2024-07-15 07:48:04.093500] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:19.392 [2024-07-15 07:48:04.093518] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:19.392 BaseBdev2 00:12:19.392 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:19.653 [2024-07-15 07:48:04.280761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:19.653 [2024-07-15 07:48:04.281766] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:19.653 [2024-07-15 07:48:04.281906] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2798360 00:12:19.653 [2024-07-15 07:48:04.281914] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:19.653 [2024-07-15 07:48:04.282057] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2795a00 00:12:19.653 [2024-07-15 07:48:04.282170] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2798360 00:12:19.653 [2024-07-15 07:48:04.282176] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2798360 00:12:19.653 [2024-07-15 07:48:04.282249] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:19.653 07:48:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:19.653 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.913 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.913 "name": "raid_bdev1", 00:12:19.913 "uuid": "441d2920-e443-4897-8283-aa0260863496", 00:12:19.913 "strip_size_kb": 0, 00:12:19.913 "state": "online", 00:12:19.913 "raid_level": "raid1", 00:12:19.913 "superblock": true, 00:12:19.913 "num_base_bdevs": 2, 00:12:19.913 "num_base_bdevs_discovered": 2, 00:12:19.913 "num_base_bdevs_operational": 2, 00:12:19.913 "base_bdevs_list": [ 00:12:19.913 { 00:12:19.913 "name": "BaseBdev1", 00:12:19.913 "uuid": "de1577a1-dd19-556e-a930-f7dea18aacbc", 00:12:19.913 "is_configured": true, 00:12:19.913 "data_offset": 2048, 00:12:19.913 "data_size": 63488 00:12:19.913 }, 00:12:19.913 { 00:12:19.913 "name": "BaseBdev2", 00:12:19.913 "uuid": "1fa8b195-3215-59e1-93ca-b28dc600b649", 00:12:19.913 "is_configured": true, 00:12:19.913 "data_offset": 2048, 00:12:19.913 "data_size": 63488 00:12:19.913 } 00:12:19.913 ] 00:12:19.913 }' 00:12:19.913 07:48:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.913 07:48:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.483 07:48:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:20.483 07:48:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:20.483 [2024-07-15 07:48:05.155189] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x29302a0 00:12:21.422 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 
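The error path being exercised here is built entirely from RPC calls that appear earlier in this trace: each leg of the raid1 volume is a malloc bdev wrapped in an error-injection bdev (the EE_* name) and then in a passthru bdev carrying the BaseBdevN name that the raid module claims. A condensed sketch of that sequence, with names, sizes and flags copied from the trace and the long /var/jenkins/.../spdk prefix abbreviated to the repository root (an assumption), run against the already-listening /var/tmp/spdk-raid.sock:

  # one leg: malloc -> error injector (EE_BaseBdev1_malloc) -> passthru claimed as BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # repeat for BaseBdev2, then assemble the raid1 volume with an on-disk superblock (-s)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
  # the failure just injected above: read errors on the first leg only
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure

Because raid1 is redundant, the injected read error is expected to be absorbed: the state check that follows still demands 2 discovered base bdevs, and the bdevperf log is later required to report a 0.00 failures-per-second rate.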
00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.682 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:21.941 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.941 "name": "raid_bdev1", 00:12:21.941 "uuid": "441d2920-e443-4897-8283-aa0260863496", 00:12:21.941 "strip_size_kb": 0, 00:12:21.941 "state": "online", 00:12:21.941 "raid_level": "raid1", 00:12:21.941 "superblock": true, 00:12:21.941 "num_base_bdevs": 2, 00:12:21.941 "num_base_bdevs_discovered": 2, 00:12:21.941 "num_base_bdevs_operational": 2, 00:12:21.941 "base_bdevs_list": [ 00:12:21.941 { 00:12:21.941 "name": "BaseBdev1", 00:12:21.941 "uuid": "de1577a1-dd19-556e-a930-f7dea18aacbc", 00:12:21.941 "is_configured": true, 00:12:21.941 "data_offset": 2048, 00:12:21.941 "data_size": 63488 00:12:21.941 }, 00:12:21.941 { 00:12:21.941 "name": "BaseBdev2", 00:12:21.941 "uuid": "1fa8b195-3215-59e1-93ca-b28dc600b649", 00:12:21.941 "is_configured": true, 00:12:21.941 "data_offset": 2048, 00:12:21.941 "data_size": 63488 00:12:21.941 } 00:12:21.941 ] 00:12:21.941 }' 00:12:21.941 07:48:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.941 07:48:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:22.509 [2024-07-15 07:48:07.195254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:22.509 [2024-07-15 07:48:07.195286] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:22.509 [2024-07-15 07:48:07.197844] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:22.509 [2024-07-15 07:48:07.197865] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:22.509 [2024-07-15 07:48:07.197921] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:22.509 
[2024-07-15 07:48:07.197927] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2798360 name raid_bdev1, state offline 00:12:22.509 0 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1606535 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1606535 ']' 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1606535 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:22.509 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1606535 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1606535' 00:12:22.769 killing process with pid 1606535 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1606535 00:12:22.769 [2024-07-15 07:48:07.282455] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1606535 00:12:22.769 [2024-07-15 07:48:07.288036] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZLO0xCUUY0 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:22.769 00:12:22.769 real 0m5.380s 00:12:22.769 user 0m8.458s 00:12:22.769 sys 0m0.751s 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:22.769 07:48:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.769 ************************************ 00:12:22.769 END TEST raid_read_error_test 00:12:22.769 ************************************ 00:12:22.769 07:48:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:22.769 07:48:07 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:22.769 07:48:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:22.769 07:48:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:22.769 07:48:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:22.769 ************************************ 00:12:22.770 START TEST raid_write_error_test 00:12:22.770 ************************************ 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cDRMEwkSAK 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1607549 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1607549 /var/tmp/spdk-raid.sock 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1607549 ']' 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:22.770 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
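The process whose startup is being awaited here follows the same harness pattern as the read test: bdevperf is launched idle (-z) on a private RPC socket, the bdev stack is then built over that socket, and the queued workload is only released later through the helper script. A sketch of the two ends of that handshake, with every flag taken from the command line recorded in this trace and paths abbreviated relative to the spdk checkout (an assumption):

  # start bdevperf idle: 60 s of 50/50 random read/write, 128k I/Os, queue depth 1, bdev_raid debug traces on
  build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid &
  # ... create the malloc/error/passthru bdevs and raid_bdev1 over the same socket ...
  # then release the deferred workload
  examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests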
00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.770 07:48:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:23.029 [2024-07-15 07:48:07.555697] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:12:23.029 [2024-07-15 07:48:07.555757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607549 ] 00:12:23.029 [2024-07-15 07:48:07.644037] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:23.029 [2024-07-15 07:48:07.720657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:23.029 [2024-07-15 07:48:07.766853] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:23.029 [2024-07-15 07:48:07.766873] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.412 07:48:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:24.412 07:48:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:24.412 07:48:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:24.412 07:48:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:24.672 BaseBdev1_malloc 00:12:24.672 07:48:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:25.242 true 00:12:25.242 07:48:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:25.813 [2024-07-15 07:48:10.342367] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:25.813 [2024-07-15 07:48:10.342404] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:25.813 [2024-07-15 07:48:10.342416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1637b50 00:12:25.813 [2024-07-15 07:48:10.342423] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:25.813 [2024-07-15 07:48:10.343811] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:25.813 [2024-07-15 07:48:10.343832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:25.813 BaseBdev1 00:12:25.813 07:48:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:25.813 07:48:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:26.383 BaseBdev2_malloc 00:12:26.383 07:48:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:26.993 true 00:12:26.993 07:48:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:27.255 [2024-07-15 07:48:11.968277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:27.255 [2024-07-15 07:48:11.968306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:27.255 [2024-07-15 07:48:11.968319] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161bea0 00:12:27.255 [2024-07-15 07:48:11.968325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:27.255 [2024-07-15 07:48:11.969556] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:27.255 [2024-07-15 07:48:11.969577] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:27.255 BaseBdev2 00:12:27.255 07:48:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:27.827 [2024-07-15 07:48:12.509636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:27.827 [2024-07-15 07:48:12.510668] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:27.827 [2024-07-15 07:48:12.510815] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1485360 00:12:27.827 [2024-07-15 07:48:12.510824] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:27.827 [2024-07-15 07:48:12.510972] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1482a00 00:12:27.827 [2024-07-15 07:48:12.511088] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1485360 00:12:27.827 [2024-07-15 07:48:12.511093] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1485360 00:12:27.827 [2024-07-15 07:48:12.511171] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.827 07:48:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.396 07:48:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.396 "name": "raid_bdev1", 00:12:28.396 "uuid": "de725706-3632-4590-8206-fd1d432875e7", 00:12:28.396 "strip_size_kb": 0, 00:12:28.396 "state": "online", 00:12:28.396 "raid_level": "raid1", 00:12:28.396 "superblock": true, 00:12:28.396 "num_base_bdevs": 2, 00:12:28.396 "num_base_bdevs_discovered": 2, 00:12:28.396 "num_base_bdevs_operational": 2, 00:12:28.396 "base_bdevs_list": [ 00:12:28.396 { 00:12:28.396 "name": "BaseBdev1", 00:12:28.396 "uuid": "82dc76f2-3eca-5c68-bfcf-dd2e491cff53", 00:12:28.396 "is_configured": true, 00:12:28.396 "data_offset": 2048, 00:12:28.396 "data_size": 63488 00:12:28.396 }, 00:12:28.396 { 00:12:28.396 "name": "BaseBdev2", 00:12:28.396 "uuid": "cbc64783-928e-5cf6-95ba-a4efb4a14696", 00:12:28.396 "is_configured": true, 00:12:28.396 "data_offset": 2048, 00:12:28.396 "data_size": 63488 00:12:28.396 } 00:12:28.396 ] 00:12:28.396 }' 00:12:28.396 07:48:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.396 07:48:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.972 07:48:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:28.972 07:48:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:28.972 [2024-07-15 07:48:13.712898] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161d2a0 00:12:29.908 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:30.167 [2024-07-15 07:48:14.801500] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:12:30.167 [2024-07-15 07:48:14.801547] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:30.167 [2024-07-15 07:48:14.801704] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x161d2a0 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:30.167 07:48:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.167 07:48:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:30.426 07:48:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.426 "name": "raid_bdev1", 00:12:30.426 "uuid": "de725706-3632-4590-8206-fd1d432875e7", 00:12:30.426 "strip_size_kb": 0, 00:12:30.426 "state": "online", 00:12:30.426 "raid_level": "raid1", 00:12:30.426 "superblock": true, 00:12:30.426 "num_base_bdevs": 2, 00:12:30.426 "num_base_bdevs_discovered": 1, 00:12:30.426 "num_base_bdevs_operational": 1, 00:12:30.426 "base_bdevs_list": [ 00:12:30.426 { 00:12:30.426 "name": null, 00:12:30.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.426 "is_configured": false, 00:12:30.426 "data_offset": 2048, 00:12:30.426 "data_size": 63488 00:12:30.426 }, 00:12:30.426 { 00:12:30.426 "name": "BaseBdev2", 00:12:30.426 "uuid": "cbc64783-928e-5cf6-95ba-a4efb4a14696", 00:12:30.426 "is_configured": true, 00:12:30.426 "data_offset": 2048, 00:12:30.426 "data_size": 63488 00:12:30.426 } 00:12:30.426 ] 00:12:30.426 }' 00:12:30.426 07:48:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.426 07:48:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.363 07:48:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:31.363 [2024-07-15 07:48:16.080731] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:31.363 [2024-07-15 07:48:16.080753] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:31.363 [2024-07-15 07:48:16.083290] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:31.363 [2024-07-15 07:48:16.083307] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:31.363 [2024-07-15 07:48:16.083342] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:31.363 [2024-07-15 07:48:16.083348] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1485360 name raid_bdev1, state offline 00:12:31.363 0 00:12:31.363 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1607549 00:12:31.363 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1607549 ']' 00:12:31.363 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1607549 00:12:31.363 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
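The JSON dumped just before this teardown is where the write case diverges from the read case: after the injected write error the raid module fails the leg outright (the "Failing base bdev in slot 0" notice above) rather than retrying it, so the array stays online but degraded. The assertion reduces to the same query used throughout this trace (a sketch, names as above):

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_operational)"'
  # expected here: "online 1" (the read-error run earlier reported "online 2")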
00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1607549 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1607549' 00:12:31.623 killing process with pid 1607549 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1607549 00:12:31.623 [2024-07-15 07:48:16.166964] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1607549 00:12:31.623 [2024-07-15 07:48:16.172180] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cDRMEwkSAK 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:31.623 00:12:31.623 real 0m8.815s 00:12:31.623 user 0m15.023s 00:12:31.623 sys 0m1.064s 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:31.623 07:48:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.623 ************************************ 00:12:31.623 END TEST raid_write_error_test 00:12:31.623 ************************************ 00:12:31.623 07:48:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:31.623 07:48:16 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:31.623 07:48:16 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:31.623 07:48:16 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:12:31.623 07:48:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:31.623 07:48:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:31.623 07:48:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:31.883 ************************************ 00:12:31.883 START TEST raid_state_function_test 00:12:31.883 ************************************ 00:12:31.883 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:12:31.883 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:31.883 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:31.883 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:31.883 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- 
# local raid_bdev 00:12:31.883 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1609616 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1609616' 00:12:31.884 Process raid pid: 1609616 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1609616 /var/tmp/spdk-raid.sock 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1609616 ']' 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:12:31.884 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:31.884 07:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.884 [2024-07-15 07:48:16.440489] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:12:31.884 [2024-07-15 07:48:16.440530] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:31.884 [2024-07-15 07:48:16.527073] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:31.884 [2024-07-15 07:48:16.590656] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:31.884 [2024-07-15 07:48:16.634615] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:31.884 [2024-07-15 07:48:16.634636] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:32.824 [2024-07-15 07:48:17.417811] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:32.824 [2024-07-15 07:48:17.417843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:32.824 [2024-07-15 07:48:17.417849] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:32.824 [2024-07-15 07:48:17.417855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:32.824 [2024-07-15 07:48:17.417860] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:32.824 [2024-07-15 07:48:17.417865] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.824 07:48:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.824 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.084 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.084 "name": "Existed_Raid", 00:12:33.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.085 "strip_size_kb": 64, 00:12:33.085 "state": "configuring", 00:12:33.085 "raid_level": "raid0", 00:12:33.085 "superblock": false, 00:12:33.085 "num_base_bdevs": 3, 00:12:33.085 "num_base_bdevs_discovered": 0, 00:12:33.085 "num_base_bdevs_operational": 3, 00:12:33.085 "base_bdevs_list": [ 00:12:33.085 { 00:12:33.085 "name": "BaseBdev1", 00:12:33.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.085 "is_configured": false, 00:12:33.085 "data_offset": 0, 00:12:33.085 "data_size": 0 00:12:33.085 }, 00:12:33.085 { 00:12:33.085 "name": "BaseBdev2", 00:12:33.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.085 "is_configured": false, 00:12:33.085 "data_offset": 0, 00:12:33.085 "data_size": 0 00:12:33.085 }, 00:12:33.085 { 00:12:33.085 "name": "BaseBdev3", 00:12:33.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.085 "is_configured": false, 00:12:33.085 "data_offset": 0, 00:12:33.085 "data_size": 0 00:12:33.085 } 00:12:33.085 ] 00:12:33.085 }' 00:12:33.085 07:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.085 07:48:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.655 07:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:33.655 [2024-07-15 07:48:18.372115] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:33.655 [2024-07-15 07:48:18.372131] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d16d0 name Existed_Raid, state configuring 00:12:33.655 07:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:33.916 [2024-07-15 07:48:18.556591] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:33.916 [2024-07-15 07:48:18.556608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:33.916 [2024-07-15 07:48:18.556613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:33.916 [2024-07-15 07:48:18.556618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:33.916 [2024-07-15 07:48:18.556623] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:33.916 [2024-07-15 07:48:18.556628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:33.916 07:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
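The state-function test inverts the usual order: Existed_Raid was created above over three base bdevs that do not exist yet, which is why every query so far reports "state": "configuring" with 0 discovered members. Each malloc bdev created from this point on is claimed by the raid as it appears, and only once the third member is claimed does the trace show the io-device registration and the state flip to online. A simplified sketch of that progression, using only calls visible in this trace (paths abbreviated, an assumption):

  # raid0 declared first; its base bdevs are still missing, so it parks in "configuring"
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  # creating a member bdev is enough for the waiting raid to claim it
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  # state stays "configuring" until all three members have been claimed
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'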
00:12:34.177 [2024-07-15 07:48:18.747590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:34.177 BaseBdev1 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:34.177 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:34.437 07:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:34.437 [ 00:12:34.437 { 00:12:34.437 "name": "BaseBdev1", 00:12:34.437 "aliases": [ 00:12:34.437 "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445" 00:12:34.437 ], 00:12:34.437 "product_name": "Malloc disk", 00:12:34.437 "block_size": 512, 00:12:34.437 "num_blocks": 65536, 00:12:34.437 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:34.437 "assigned_rate_limits": { 00:12:34.437 "rw_ios_per_sec": 0, 00:12:34.437 "rw_mbytes_per_sec": 0, 00:12:34.437 "r_mbytes_per_sec": 0, 00:12:34.437 "w_mbytes_per_sec": 0 00:12:34.437 }, 00:12:34.437 "claimed": true, 00:12:34.437 "claim_type": "exclusive_write", 00:12:34.437 "zoned": false, 00:12:34.437 "supported_io_types": { 00:12:34.437 "read": true, 00:12:34.437 "write": true, 00:12:34.437 "unmap": true, 00:12:34.437 "flush": true, 00:12:34.437 "reset": true, 00:12:34.437 "nvme_admin": false, 00:12:34.437 "nvme_io": false, 00:12:34.437 "nvme_io_md": false, 00:12:34.437 "write_zeroes": true, 00:12:34.437 "zcopy": true, 00:12:34.437 "get_zone_info": false, 00:12:34.437 "zone_management": false, 00:12:34.437 "zone_append": false, 00:12:34.437 "compare": false, 00:12:34.437 "compare_and_write": false, 00:12:34.437 "abort": true, 00:12:34.437 "seek_hole": false, 00:12:34.437 "seek_data": false, 00:12:34.437 "copy": true, 00:12:34.437 "nvme_iov_md": false 00:12:34.437 }, 00:12:34.437 "memory_domains": [ 00:12:34.437 { 00:12:34.437 "dma_device_id": "system", 00:12:34.437 "dma_device_type": 1 00:12:34.437 }, 00:12:34.437 { 00:12:34.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.437 "dma_device_type": 2 00:12:34.437 } 00:12:34.437 ], 00:12:34.437 "driver_specific": {} 00:12:34.437 } 00:12:34.437 ] 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:34.437 07:48:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.437 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.697 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.697 "name": "Existed_Raid", 00:12:34.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.697 "strip_size_kb": 64, 00:12:34.697 "state": "configuring", 00:12:34.697 "raid_level": "raid0", 00:12:34.697 "superblock": false, 00:12:34.697 "num_base_bdevs": 3, 00:12:34.697 "num_base_bdevs_discovered": 1, 00:12:34.697 "num_base_bdevs_operational": 3, 00:12:34.697 "base_bdevs_list": [ 00:12:34.697 { 00:12:34.697 "name": "BaseBdev1", 00:12:34.697 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:34.697 "is_configured": true, 00:12:34.698 "data_offset": 0, 00:12:34.698 "data_size": 65536 00:12:34.698 }, 00:12:34.698 { 00:12:34.698 "name": "BaseBdev2", 00:12:34.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.698 "is_configured": false, 00:12:34.698 "data_offset": 0, 00:12:34.698 "data_size": 0 00:12:34.698 }, 00:12:34.698 { 00:12:34.698 "name": "BaseBdev3", 00:12:34.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.698 "is_configured": false, 00:12:34.698 "data_offset": 0, 00:12:34.698 "data_size": 0 00:12:34.698 } 00:12:34.698 ] 00:12:34.698 }' 00:12:34.698 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.698 07:48:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:35.270 07:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:35.532 [2024-07-15 07:48:20.059248] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:35.532 [2024-07-15 07:48:20.059279] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d0fa0 name Existed_Raid, state configuring 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:35.532 [2024-07-15 07:48:20.255778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:35.532 [2024-07-15 07:48:20.256930] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:35.532 [2024-07-15 07:48:20.256956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:35.532 [2024-07-15 07:48:20.256962] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:35.532 [2024-07-15 07:48:20.256968] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.532 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.794 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.794 "name": "Existed_Raid", 00:12:35.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.794 "strip_size_kb": 64, 00:12:35.794 "state": "configuring", 00:12:35.794 "raid_level": "raid0", 00:12:35.794 "superblock": false, 00:12:35.794 "num_base_bdevs": 3, 00:12:35.794 "num_base_bdevs_discovered": 1, 00:12:35.794 "num_base_bdevs_operational": 3, 00:12:35.794 "base_bdevs_list": [ 00:12:35.794 { 00:12:35.794 "name": "BaseBdev1", 00:12:35.794 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:35.794 "is_configured": true, 00:12:35.794 "data_offset": 0, 00:12:35.794 "data_size": 65536 00:12:35.794 }, 00:12:35.794 { 00:12:35.794 "name": "BaseBdev2", 00:12:35.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.794 "is_configured": false, 00:12:35.794 "data_offset": 0, 00:12:35.794 "data_size": 0 00:12:35.794 }, 00:12:35.794 { 00:12:35.794 "name": "BaseBdev3", 00:12:35.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.794 "is_configured": false, 00:12:35.794 "data_offset": 0, 00:12:35.794 "data_size": 0 00:12:35.794 } 00:12:35.794 ] 00:12:35.794 }' 00:12:35.794 07:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.794 07:48:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.365 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
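Each waitforbdev step in this trace boils down to the same two RPCs: wait for any pending bdev examination to finish, then poll for the named bdev with the -t 2000 timeout shown above. A sketch with the arguments used for the member just created:

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000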
00:12:36.626 [2024-07-15 07:48:21.182809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:36.626 BaseBdev2 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:36.626 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:36.887 [ 00:12:36.887 { 00:12:36.887 "name": "BaseBdev2", 00:12:36.887 "aliases": [ 00:12:36.887 "30186fe9-5e10-430d-9863-ef5999f648f7" 00:12:36.887 ], 00:12:36.887 "product_name": "Malloc disk", 00:12:36.887 "block_size": 512, 00:12:36.887 "num_blocks": 65536, 00:12:36.887 "uuid": "30186fe9-5e10-430d-9863-ef5999f648f7", 00:12:36.887 "assigned_rate_limits": { 00:12:36.887 "rw_ios_per_sec": 0, 00:12:36.887 "rw_mbytes_per_sec": 0, 00:12:36.887 "r_mbytes_per_sec": 0, 00:12:36.887 "w_mbytes_per_sec": 0 00:12:36.887 }, 00:12:36.887 "claimed": true, 00:12:36.887 "claim_type": "exclusive_write", 00:12:36.887 "zoned": false, 00:12:36.887 "supported_io_types": { 00:12:36.887 "read": true, 00:12:36.887 "write": true, 00:12:36.887 "unmap": true, 00:12:36.887 "flush": true, 00:12:36.887 "reset": true, 00:12:36.887 "nvme_admin": false, 00:12:36.887 "nvme_io": false, 00:12:36.887 "nvme_io_md": false, 00:12:36.887 "write_zeroes": true, 00:12:36.887 "zcopy": true, 00:12:36.887 "get_zone_info": false, 00:12:36.887 "zone_management": false, 00:12:36.887 "zone_append": false, 00:12:36.887 "compare": false, 00:12:36.887 "compare_and_write": false, 00:12:36.887 "abort": true, 00:12:36.887 "seek_hole": false, 00:12:36.887 "seek_data": false, 00:12:36.887 "copy": true, 00:12:36.887 "nvme_iov_md": false 00:12:36.887 }, 00:12:36.887 "memory_domains": [ 00:12:36.887 { 00:12:36.887 "dma_device_id": "system", 00:12:36.887 "dma_device_type": 1 00:12:36.887 }, 00:12:36.887 { 00:12:36.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.887 "dma_device_type": 2 00:12:36.887 } 00:12:36.887 ], 00:12:36.887 "driver_specific": {} 00:12:36.887 } 00:12:36.887 ] 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.887 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.148 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.148 "name": "Existed_Raid", 00:12:37.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.148 "strip_size_kb": 64, 00:12:37.148 "state": "configuring", 00:12:37.148 "raid_level": "raid0", 00:12:37.148 "superblock": false, 00:12:37.148 "num_base_bdevs": 3, 00:12:37.148 "num_base_bdevs_discovered": 2, 00:12:37.148 "num_base_bdevs_operational": 3, 00:12:37.148 "base_bdevs_list": [ 00:12:37.148 { 00:12:37.148 "name": "BaseBdev1", 00:12:37.148 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:37.148 "is_configured": true, 00:12:37.148 "data_offset": 0, 00:12:37.148 "data_size": 65536 00:12:37.148 }, 00:12:37.148 { 00:12:37.148 "name": "BaseBdev2", 00:12:37.148 "uuid": "30186fe9-5e10-430d-9863-ef5999f648f7", 00:12:37.148 "is_configured": true, 00:12:37.148 "data_offset": 0, 00:12:37.148 "data_size": 65536 00:12:37.148 }, 00:12:37.148 { 00:12:37.148 "name": "BaseBdev3", 00:12:37.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.148 "is_configured": false, 00:12:37.148 "data_offset": 0, 00:12:37.148 "data_size": 0 00:12:37.148 } 00:12:37.148 ] 00:12:37.148 }' 00:12:37.148 07:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.148 07:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.719 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:37.980 [2024-07-15 07:48:22.478850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:37.980 [2024-07-15 07:48:22.478875] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12d1e90 00:12:37.980 [2024-07-15 07:48:22.478880] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:37.980 [2024-07-15 07:48:22.479023] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12d1b60 00:12:37.980 [2024-07-15 07:48:22.479119] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12d1e90 00:12:37.980 [2024-07-15 07:48:22.479125] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 
0x12d1e90 00:12:37.980 [2024-07-15 07:48:22.479240] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:37.980 BaseBdev3 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.980 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:38.241 [ 00:12:38.241 { 00:12:38.241 "name": "BaseBdev3", 00:12:38.241 "aliases": [ 00:12:38.241 "8ac47b1c-72b2-46a0-ae86-b9756293b6b7" 00:12:38.241 ], 00:12:38.241 "product_name": "Malloc disk", 00:12:38.241 "block_size": 512, 00:12:38.241 "num_blocks": 65536, 00:12:38.241 "uuid": "8ac47b1c-72b2-46a0-ae86-b9756293b6b7", 00:12:38.241 "assigned_rate_limits": { 00:12:38.241 "rw_ios_per_sec": 0, 00:12:38.241 "rw_mbytes_per_sec": 0, 00:12:38.241 "r_mbytes_per_sec": 0, 00:12:38.241 "w_mbytes_per_sec": 0 00:12:38.241 }, 00:12:38.241 "claimed": true, 00:12:38.241 "claim_type": "exclusive_write", 00:12:38.241 "zoned": false, 00:12:38.241 "supported_io_types": { 00:12:38.241 "read": true, 00:12:38.241 "write": true, 00:12:38.241 "unmap": true, 00:12:38.241 "flush": true, 00:12:38.241 "reset": true, 00:12:38.241 "nvme_admin": false, 00:12:38.241 "nvme_io": false, 00:12:38.241 "nvme_io_md": false, 00:12:38.241 "write_zeroes": true, 00:12:38.241 "zcopy": true, 00:12:38.241 "get_zone_info": false, 00:12:38.241 "zone_management": false, 00:12:38.241 "zone_append": false, 00:12:38.241 "compare": false, 00:12:38.241 "compare_and_write": false, 00:12:38.241 "abort": true, 00:12:38.241 "seek_hole": false, 00:12:38.241 "seek_data": false, 00:12:38.241 "copy": true, 00:12:38.241 "nvme_iov_md": false 00:12:38.241 }, 00:12:38.241 "memory_domains": [ 00:12:38.241 { 00:12:38.241 "dma_device_id": "system", 00:12:38.241 "dma_device_type": 1 00:12:38.241 }, 00:12:38.241 { 00:12:38.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.241 "dma_device_type": 2 00:12:38.241 } 00:12:38.241 ], 00:12:38.241 "driver_specific": {} 00:12:38.241 } 00:12:38.241 ] 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.241 07:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.503 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.503 "name": "Existed_Raid", 00:12:38.503 "uuid": "613b6d4b-04b5-4d34-8eaa-2d2e1e3fb428", 00:12:38.503 "strip_size_kb": 64, 00:12:38.503 "state": "online", 00:12:38.503 "raid_level": "raid0", 00:12:38.503 "superblock": false, 00:12:38.503 "num_base_bdevs": 3, 00:12:38.503 "num_base_bdevs_discovered": 3, 00:12:38.503 "num_base_bdevs_operational": 3, 00:12:38.503 "base_bdevs_list": [ 00:12:38.503 { 00:12:38.503 "name": "BaseBdev1", 00:12:38.503 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:38.503 "is_configured": true, 00:12:38.503 "data_offset": 0, 00:12:38.503 "data_size": 65536 00:12:38.503 }, 00:12:38.503 { 00:12:38.503 "name": "BaseBdev2", 00:12:38.503 "uuid": "30186fe9-5e10-430d-9863-ef5999f648f7", 00:12:38.503 "is_configured": true, 00:12:38.503 "data_offset": 0, 00:12:38.503 "data_size": 65536 00:12:38.503 }, 00:12:38.503 { 00:12:38.503 "name": "BaseBdev3", 00:12:38.503 "uuid": "8ac47b1c-72b2-46a0-ae86-b9756293b6b7", 00:12:38.503 "is_configured": true, 00:12:38.503 "data_offset": 0, 00:12:38.503 "data_size": 65536 00:12:38.503 } 00:12:38.503 ] 00:12:38.503 }' 00:12:38.503 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.503 07:48:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:39.072 
[2024-07-15 07:48:23.802453] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:39.072 "name": "Existed_Raid", 00:12:39.072 "aliases": [ 00:12:39.072 "613b6d4b-04b5-4d34-8eaa-2d2e1e3fb428" 00:12:39.072 ], 00:12:39.072 "product_name": "Raid Volume", 00:12:39.072 "block_size": 512, 00:12:39.072 "num_blocks": 196608, 00:12:39.072 "uuid": "613b6d4b-04b5-4d34-8eaa-2d2e1e3fb428", 00:12:39.072 "assigned_rate_limits": { 00:12:39.072 "rw_ios_per_sec": 0, 00:12:39.072 "rw_mbytes_per_sec": 0, 00:12:39.072 "r_mbytes_per_sec": 0, 00:12:39.072 "w_mbytes_per_sec": 0 00:12:39.072 }, 00:12:39.072 "claimed": false, 00:12:39.072 "zoned": false, 00:12:39.072 "supported_io_types": { 00:12:39.072 "read": true, 00:12:39.072 "write": true, 00:12:39.072 "unmap": true, 00:12:39.072 "flush": true, 00:12:39.072 "reset": true, 00:12:39.072 "nvme_admin": false, 00:12:39.072 "nvme_io": false, 00:12:39.072 "nvme_io_md": false, 00:12:39.072 "write_zeroes": true, 00:12:39.072 "zcopy": false, 00:12:39.072 "get_zone_info": false, 00:12:39.072 "zone_management": false, 00:12:39.072 "zone_append": false, 00:12:39.072 "compare": false, 00:12:39.072 "compare_and_write": false, 00:12:39.072 "abort": false, 00:12:39.072 "seek_hole": false, 00:12:39.072 "seek_data": false, 00:12:39.072 "copy": false, 00:12:39.072 "nvme_iov_md": false 00:12:39.072 }, 00:12:39.072 "memory_domains": [ 00:12:39.072 { 00:12:39.072 "dma_device_id": "system", 00:12:39.072 "dma_device_type": 1 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.072 "dma_device_type": 2 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "dma_device_id": "system", 00:12:39.072 "dma_device_type": 1 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.072 "dma_device_type": 2 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "dma_device_id": "system", 00:12:39.072 "dma_device_type": 1 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.072 "dma_device_type": 2 00:12:39.072 } 00:12:39.072 ], 00:12:39.072 "driver_specific": { 00:12:39.072 "raid": { 00:12:39.072 "uuid": "613b6d4b-04b5-4d34-8eaa-2d2e1e3fb428", 00:12:39.072 "strip_size_kb": 64, 00:12:39.072 "state": "online", 00:12:39.072 "raid_level": "raid0", 00:12:39.072 "superblock": false, 00:12:39.072 "num_base_bdevs": 3, 00:12:39.072 "num_base_bdevs_discovered": 3, 00:12:39.072 "num_base_bdevs_operational": 3, 00:12:39.072 "base_bdevs_list": [ 00:12:39.072 { 00:12:39.072 "name": "BaseBdev1", 00:12:39.072 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:39.072 "is_configured": true, 00:12:39.072 "data_offset": 0, 00:12:39.072 "data_size": 65536 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "name": "BaseBdev2", 00:12:39.072 "uuid": "30186fe9-5e10-430d-9863-ef5999f648f7", 00:12:39.072 "is_configured": true, 00:12:39.072 "data_offset": 0, 00:12:39.072 "data_size": 65536 00:12:39.072 }, 00:12:39.072 { 00:12:39.072 "name": "BaseBdev3", 00:12:39.072 "uuid": "8ac47b1c-72b2-46a0-ae86-b9756293b6b7", 00:12:39.072 "is_configured": true, 00:12:39.072 "data_offset": 0, 00:12:39.072 "data_size": 65536 00:12:39.072 } 00:12:39.072 ] 00:12:39.072 } 00:12:39.072 } 00:12:39.072 }' 00:12:39.072 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:39.332 07:48:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:39.332 BaseBdev2 00:12:39.332 BaseBdev3' 00:12:39.332 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:39.332 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:39.332 07:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:39.332 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:39.332 "name": "BaseBdev1", 00:12:39.332 "aliases": [ 00:12:39.332 "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445" 00:12:39.332 ], 00:12:39.332 "product_name": "Malloc disk", 00:12:39.332 "block_size": 512, 00:12:39.332 "num_blocks": 65536, 00:12:39.332 "uuid": "3d3c6a20-1a3d-4b90-bd2d-6828f45e5445", 00:12:39.332 "assigned_rate_limits": { 00:12:39.332 "rw_ios_per_sec": 0, 00:12:39.332 "rw_mbytes_per_sec": 0, 00:12:39.332 "r_mbytes_per_sec": 0, 00:12:39.332 "w_mbytes_per_sec": 0 00:12:39.332 }, 00:12:39.332 "claimed": true, 00:12:39.332 "claim_type": "exclusive_write", 00:12:39.332 "zoned": false, 00:12:39.332 "supported_io_types": { 00:12:39.332 "read": true, 00:12:39.332 "write": true, 00:12:39.332 "unmap": true, 00:12:39.332 "flush": true, 00:12:39.332 "reset": true, 00:12:39.332 "nvme_admin": false, 00:12:39.332 "nvme_io": false, 00:12:39.332 "nvme_io_md": false, 00:12:39.332 "write_zeroes": true, 00:12:39.332 "zcopy": true, 00:12:39.332 "get_zone_info": false, 00:12:39.332 "zone_management": false, 00:12:39.332 "zone_append": false, 00:12:39.332 "compare": false, 00:12:39.332 "compare_and_write": false, 00:12:39.332 "abort": true, 00:12:39.332 "seek_hole": false, 00:12:39.332 "seek_data": false, 00:12:39.332 "copy": true, 00:12:39.332 "nvme_iov_md": false 00:12:39.332 }, 00:12:39.332 "memory_domains": [ 00:12:39.332 { 00:12:39.332 "dma_device_id": "system", 00:12:39.332 "dma_device_type": 1 00:12:39.332 }, 00:12:39.332 { 00:12:39.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.332 "dma_device_type": 2 00:12:39.332 } 00:12:39.332 ], 00:12:39.332 "driver_specific": {} 00:12:39.332 }' 00:12:39.332 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:39.592 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.852 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.852 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:39.852 07:48:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:39.852 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:39.852 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:39.852 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:39.852 "name": "BaseBdev2", 00:12:39.852 "aliases": [ 00:12:39.852 "30186fe9-5e10-430d-9863-ef5999f648f7" 00:12:39.852 ], 00:12:39.852 "product_name": "Malloc disk", 00:12:39.852 "block_size": 512, 00:12:39.852 "num_blocks": 65536, 00:12:39.852 "uuid": "30186fe9-5e10-430d-9863-ef5999f648f7", 00:12:39.852 "assigned_rate_limits": { 00:12:39.852 "rw_ios_per_sec": 0, 00:12:39.852 "rw_mbytes_per_sec": 0, 00:12:39.852 "r_mbytes_per_sec": 0, 00:12:39.852 "w_mbytes_per_sec": 0 00:12:39.852 }, 00:12:39.852 "claimed": true, 00:12:39.852 "claim_type": "exclusive_write", 00:12:39.852 "zoned": false, 00:12:39.852 "supported_io_types": { 00:12:39.852 "read": true, 00:12:39.852 "write": true, 00:12:39.852 "unmap": true, 00:12:39.852 "flush": true, 00:12:39.852 "reset": true, 00:12:39.852 "nvme_admin": false, 00:12:39.852 "nvme_io": false, 00:12:39.852 "nvme_io_md": false, 00:12:39.852 "write_zeroes": true, 00:12:39.852 "zcopy": true, 00:12:39.852 "get_zone_info": false, 00:12:39.852 "zone_management": false, 00:12:39.852 "zone_append": false, 00:12:39.852 "compare": false, 00:12:39.852 "compare_and_write": false, 00:12:39.852 "abort": true, 00:12:39.852 "seek_hole": false, 00:12:39.852 "seek_data": false, 00:12:39.852 "copy": true, 00:12:39.852 "nvme_iov_md": false 00:12:39.852 }, 00:12:39.852 "memory_domains": [ 00:12:39.852 { 00:12:39.852 "dma_device_id": "system", 00:12:39.852 "dma_device_type": 1 00:12:39.852 }, 00:12:39.852 { 00:12:39.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.852 "dma_device_type": 2 00:12:39.852 } 00:12:39.852 ], 00:12:39.852 "driver_specific": {} 00:12:39.852 }' 00:12:39.852 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:40.112 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:40.372 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:40.372 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:40.372 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:40.372 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:40.372 07:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:40.372 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:40.372 "name": "BaseBdev3", 00:12:40.372 "aliases": [ 00:12:40.372 "8ac47b1c-72b2-46a0-ae86-b9756293b6b7" 00:12:40.372 ], 00:12:40.372 "product_name": "Malloc disk", 00:12:40.372 "block_size": 512, 00:12:40.372 "num_blocks": 65536, 00:12:40.372 "uuid": "8ac47b1c-72b2-46a0-ae86-b9756293b6b7", 00:12:40.372 "assigned_rate_limits": { 00:12:40.372 "rw_ios_per_sec": 0, 00:12:40.372 "rw_mbytes_per_sec": 0, 00:12:40.372 "r_mbytes_per_sec": 0, 00:12:40.372 "w_mbytes_per_sec": 0 00:12:40.372 }, 00:12:40.372 "claimed": true, 00:12:40.372 "claim_type": "exclusive_write", 00:12:40.372 "zoned": false, 00:12:40.372 "supported_io_types": { 00:12:40.372 "read": true, 00:12:40.372 "write": true, 00:12:40.372 "unmap": true, 00:12:40.372 "flush": true, 00:12:40.372 "reset": true, 00:12:40.372 "nvme_admin": false, 00:12:40.372 "nvme_io": false, 00:12:40.372 "nvme_io_md": false, 00:12:40.372 "write_zeroes": true, 00:12:40.372 "zcopy": true, 00:12:40.372 "get_zone_info": false, 00:12:40.372 "zone_management": false, 00:12:40.372 "zone_append": false, 00:12:40.372 "compare": false, 00:12:40.372 "compare_and_write": false, 00:12:40.372 "abort": true, 00:12:40.372 "seek_hole": false, 00:12:40.372 "seek_data": false, 00:12:40.372 "copy": true, 00:12:40.372 "nvme_iov_md": false 00:12:40.372 }, 00:12:40.372 "memory_domains": [ 00:12:40.372 { 00:12:40.372 "dma_device_id": "system", 00:12:40.372 "dma_device_type": 1 00:12:40.372 }, 00:12:40.372 { 00:12:40.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.372 "dma_device_type": 2 00:12:40.372 } 00:12:40.372 ], 00:12:40.372 "driver_specific": {} 00:12:40.372 }' 00:12:40.372 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:40.632 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:40.891 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:40.891 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:40.891 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:41.153 [2024-07-15 07:48:25.650946] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:41.153 [2024-07-15 07:48:25.650963] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:41.153 [2024-07-15 07:48:25.650992] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.153 "name": "Existed_Raid", 00:12:41.153 "uuid": "613b6d4b-04b5-4d34-8eaa-2d2e1e3fb428", 00:12:41.153 "strip_size_kb": 64, 00:12:41.153 "state": "offline", 00:12:41.153 "raid_level": "raid0", 00:12:41.153 "superblock": false, 00:12:41.153 "num_base_bdevs": 3, 00:12:41.153 "num_base_bdevs_discovered": 2, 00:12:41.153 "num_base_bdevs_operational": 2, 00:12:41.153 "base_bdevs_list": [ 00:12:41.153 { 00:12:41.153 "name": null, 00:12:41.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.153 "is_configured": false, 00:12:41.153 "data_offset": 0, 00:12:41.153 "data_size": 65536 00:12:41.153 }, 00:12:41.153 { 00:12:41.153 "name": "BaseBdev2", 00:12:41.153 "uuid": "30186fe9-5e10-430d-9863-ef5999f648f7", 00:12:41.153 "is_configured": true, 00:12:41.153 "data_offset": 0, 00:12:41.153 "data_size": 65536 00:12:41.153 }, 00:12:41.153 { 00:12:41.153 "name": "BaseBdev3", 00:12:41.153 "uuid": "8ac47b1c-72b2-46a0-ae86-b9756293b6b7", 00:12:41.153 "is_configured": true, 00:12:41.153 "data_offset": 0, 00:12:41.153 "data_size": 65536 00:12:41.153 } 00:12:41.153 ] 00:12:41.153 }' 00:12:41.153 07:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.153 07:48:25 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.723 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:41.723 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:41.723 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.723 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:41.983 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:41.983 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:41.983 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:42.243 [2024-07-15 07:48:26.773846] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:42.243 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:42.243 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:42.243 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.243 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:42.504 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:42.504 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:42.504 07:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:42.504 [2024-07-15 07:48:27.176499] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:42.504 [2024-07-15 07:48:27.176528] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d1e90 name Existed_Raid, state offline 00:12:42.504 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:42.504 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:42.504 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.504 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:42.764 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:42.764 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:42.764 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:42.764 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:42.764 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:42.764 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:43.024 BaseBdev2 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:43.024 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:43.284 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:43.284 [ 00:12:43.284 { 00:12:43.284 "name": "BaseBdev2", 00:12:43.284 "aliases": [ 00:12:43.284 "9d82a0fe-ca5b-416f-a65b-08eb10e89024" 00:12:43.284 ], 00:12:43.284 "product_name": "Malloc disk", 00:12:43.284 "block_size": 512, 00:12:43.284 "num_blocks": 65536, 00:12:43.284 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:43.284 "assigned_rate_limits": { 00:12:43.284 "rw_ios_per_sec": 0, 00:12:43.284 "rw_mbytes_per_sec": 0, 00:12:43.284 "r_mbytes_per_sec": 0, 00:12:43.284 "w_mbytes_per_sec": 0 00:12:43.284 }, 00:12:43.284 "claimed": false, 00:12:43.284 "zoned": false, 00:12:43.284 "supported_io_types": { 00:12:43.284 "read": true, 00:12:43.284 "write": true, 00:12:43.284 "unmap": true, 00:12:43.284 "flush": true, 00:12:43.284 "reset": true, 00:12:43.284 "nvme_admin": false, 00:12:43.284 "nvme_io": false, 00:12:43.284 "nvme_io_md": false, 00:12:43.284 "write_zeroes": true, 00:12:43.284 "zcopy": true, 00:12:43.284 "get_zone_info": false, 00:12:43.284 "zone_management": false, 00:12:43.284 "zone_append": false, 00:12:43.284 "compare": false, 00:12:43.284 "compare_and_write": false, 00:12:43.284 "abort": true, 00:12:43.284 "seek_hole": false, 00:12:43.284 "seek_data": false, 00:12:43.284 "copy": true, 00:12:43.284 "nvme_iov_md": false 00:12:43.284 }, 00:12:43.284 "memory_domains": [ 00:12:43.284 { 00:12:43.284 "dma_device_id": "system", 00:12:43.284 "dma_device_type": 1 00:12:43.284 }, 00:12:43.284 { 00:12:43.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.284 "dma_device_type": 2 00:12:43.284 } 00:12:43.284 ], 00:12:43.284 "driver_specific": {} 00:12:43.284 } 00:12:43.284 ] 00:12:43.284 07:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:43.284 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:43.284 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:43.284 07:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:43.545 BaseBdev3 00:12:43.545 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:43.545 07:48:28 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:12:43.545 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:43.545 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:43.545 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:43.545 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:43.545 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:43.805 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:43.805 [ 00:12:43.805 { 00:12:43.805 "name": "BaseBdev3", 00:12:43.805 "aliases": [ 00:12:43.805 "428711be-a42d-47ba-bee9-e63c747bda8c" 00:12:43.805 ], 00:12:43.805 "product_name": "Malloc disk", 00:12:43.805 "block_size": 512, 00:12:43.806 "num_blocks": 65536, 00:12:43.806 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:43.806 "assigned_rate_limits": { 00:12:43.806 "rw_ios_per_sec": 0, 00:12:43.806 "rw_mbytes_per_sec": 0, 00:12:43.806 "r_mbytes_per_sec": 0, 00:12:43.806 "w_mbytes_per_sec": 0 00:12:43.806 }, 00:12:43.806 "claimed": false, 00:12:43.806 "zoned": false, 00:12:43.806 "supported_io_types": { 00:12:43.806 "read": true, 00:12:43.806 "write": true, 00:12:43.806 "unmap": true, 00:12:43.806 "flush": true, 00:12:43.806 "reset": true, 00:12:43.806 "nvme_admin": false, 00:12:43.806 "nvme_io": false, 00:12:43.806 "nvme_io_md": false, 00:12:43.806 "write_zeroes": true, 00:12:43.806 "zcopy": true, 00:12:43.806 "get_zone_info": false, 00:12:43.806 "zone_management": false, 00:12:43.806 "zone_append": false, 00:12:43.806 "compare": false, 00:12:43.806 "compare_and_write": false, 00:12:43.806 "abort": true, 00:12:43.806 "seek_hole": false, 00:12:43.806 "seek_data": false, 00:12:43.806 "copy": true, 00:12:43.806 "nvme_iov_md": false 00:12:43.806 }, 00:12:43.806 "memory_domains": [ 00:12:43.806 { 00:12:43.806 "dma_device_id": "system", 00:12:43.806 "dma_device_type": 1 00:12:43.806 }, 00:12:43.806 { 00:12:43.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.806 "dma_device_type": 2 00:12:43.806 } 00:12:43.806 ], 00:12:43.806 "driver_specific": {} 00:12:43.806 } 00:12:43.806 ] 00:12:43.806 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:43.806 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:43.806 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:43.806 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:44.108 [2024-07-15 07:48:28.716014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:44.108 [2024-07-15 07:48:28.716045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:44.108 [2024-07-15 07:48:28.716057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:44.108 
[2024-07-15 07:48:28.717111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.108 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.371 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.371 "name": "Existed_Raid", 00:12:44.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.371 "strip_size_kb": 64, 00:12:44.371 "state": "configuring", 00:12:44.371 "raid_level": "raid0", 00:12:44.371 "superblock": false, 00:12:44.371 "num_base_bdevs": 3, 00:12:44.371 "num_base_bdevs_discovered": 2, 00:12:44.371 "num_base_bdevs_operational": 3, 00:12:44.371 "base_bdevs_list": [ 00:12:44.371 { 00:12:44.371 "name": "BaseBdev1", 00:12:44.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.371 "is_configured": false, 00:12:44.371 "data_offset": 0, 00:12:44.371 "data_size": 0 00:12:44.371 }, 00:12:44.371 { 00:12:44.371 "name": "BaseBdev2", 00:12:44.371 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:44.371 "is_configured": true, 00:12:44.371 "data_offset": 0, 00:12:44.371 "data_size": 65536 00:12:44.371 }, 00:12:44.371 { 00:12:44.371 "name": "BaseBdev3", 00:12:44.371 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:44.371 "is_configured": true, 00:12:44.371 "data_offset": 0, 00:12:44.371 "data_size": 65536 00:12:44.371 } 00:12:44.371 ] 00:12:44.371 }' 00:12:44.371 07:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.371 07:48:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:44.940 [2024-07-15 07:48:29.650505] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.940 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.215 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.215 "name": "Existed_Raid", 00:12:45.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.215 "strip_size_kb": 64, 00:12:45.215 "state": "configuring", 00:12:45.215 "raid_level": "raid0", 00:12:45.215 "superblock": false, 00:12:45.215 "num_base_bdevs": 3, 00:12:45.215 "num_base_bdevs_discovered": 1, 00:12:45.215 "num_base_bdevs_operational": 3, 00:12:45.215 "base_bdevs_list": [ 00:12:45.215 { 00:12:45.215 "name": "BaseBdev1", 00:12:45.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.215 "is_configured": false, 00:12:45.215 "data_offset": 0, 00:12:45.215 "data_size": 0 00:12:45.215 }, 00:12:45.215 { 00:12:45.215 "name": null, 00:12:45.215 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:45.215 "is_configured": false, 00:12:45.215 "data_offset": 0, 00:12:45.215 "data_size": 65536 00:12:45.215 }, 00:12:45.215 { 00:12:45.215 "name": "BaseBdev3", 00:12:45.215 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:45.215 "is_configured": true, 00:12:45.215 "data_offset": 0, 00:12:45.215 "data_size": 65536 00:12:45.215 } 00:12:45.215 ] 00:12:45.215 }' 00:12:45.215 07:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.215 07:48:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.783 07:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.783 07:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:46.044 [2024-07-15 07:48:30.766187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:46.044 BaseBdev1 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:46.044 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:46.304 07:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:46.563 [ 00:12:46.563 { 00:12:46.563 "name": "BaseBdev1", 00:12:46.563 "aliases": [ 00:12:46.563 "836cbf72-4a0a-4857-a5e5-360043b3143a" 00:12:46.563 ], 00:12:46.563 "product_name": "Malloc disk", 00:12:46.563 "block_size": 512, 00:12:46.563 "num_blocks": 65536, 00:12:46.563 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:46.563 "assigned_rate_limits": { 00:12:46.563 "rw_ios_per_sec": 0, 00:12:46.563 "rw_mbytes_per_sec": 0, 00:12:46.563 "r_mbytes_per_sec": 0, 00:12:46.563 "w_mbytes_per_sec": 0 00:12:46.563 }, 00:12:46.563 "claimed": true, 00:12:46.563 "claim_type": "exclusive_write", 00:12:46.563 "zoned": false, 00:12:46.563 "supported_io_types": { 00:12:46.563 "read": true, 00:12:46.563 "write": true, 00:12:46.563 "unmap": true, 00:12:46.563 "flush": true, 00:12:46.563 "reset": true, 00:12:46.563 "nvme_admin": false, 00:12:46.563 "nvme_io": false, 00:12:46.563 "nvme_io_md": false, 00:12:46.563 "write_zeroes": true, 00:12:46.563 "zcopy": true, 00:12:46.563 "get_zone_info": false, 00:12:46.563 "zone_management": false, 00:12:46.563 "zone_append": false, 00:12:46.563 "compare": false, 00:12:46.563 "compare_and_write": false, 00:12:46.563 "abort": true, 00:12:46.563 "seek_hole": false, 00:12:46.563 "seek_data": false, 00:12:46.563 "copy": true, 00:12:46.563 "nvme_iov_md": false 00:12:46.563 }, 00:12:46.563 "memory_domains": [ 00:12:46.563 { 00:12:46.563 "dma_device_id": "system", 00:12:46.563 "dma_device_type": 1 00:12:46.563 }, 00:12:46.563 { 00:12:46.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.563 "dma_device_type": 2 00:12:46.563 } 00:12:46.563 ], 00:12:46.563 "driver_specific": {} 00:12:46.563 } 00:12:46.563 ] 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:46.563 07:48:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.563 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.823 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.823 "name": "Existed_Raid", 00:12:46.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.823 "strip_size_kb": 64, 00:12:46.823 "state": "configuring", 00:12:46.823 "raid_level": "raid0", 00:12:46.823 "superblock": false, 00:12:46.823 "num_base_bdevs": 3, 00:12:46.823 "num_base_bdevs_discovered": 2, 00:12:46.823 "num_base_bdevs_operational": 3, 00:12:46.823 "base_bdevs_list": [ 00:12:46.823 { 00:12:46.823 "name": "BaseBdev1", 00:12:46.823 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:46.823 "is_configured": true, 00:12:46.823 "data_offset": 0, 00:12:46.823 "data_size": 65536 00:12:46.823 }, 00:12:46.823 { 00:12:46.823 "name": null, 00:12:46.823 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:46.823 "is_configured": false, 00:12:46.823 "data_offset": 0, 00:12:46.823 "data_size": 65536 00:12:46.823 }, 00:12:46.823 { 00:12:46.823 "name": "BaseBdev3", 00:12:46.823 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:46.823 "is_configured": true, 00:12:46.823 "data_offset": 0, 00:12:46.823 "data_size": 65536 00:12:46.823 } 00:12:46.823 ] 00:12:46.823 }' 00:12:46.823 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.823 07:48:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.394 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.394 07:48:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:47.394 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:47.394 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:47.654 [2024-07-15 07:48:32.241931] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.654 07:48:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.654 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.915 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.915 "name": "Existed_Raid", 00:12:47.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.915 "strip_size_kb": 64, 00:12:47.915 "state": "configuring", 00:12:47.915 "raid_level": "raid0", 00:12:47.915 "superblock": false, 00:12:47.915 "num_base_bdevs": 3, 00:12:47.915 "num_base_bdevs_discovered": 1, 00:12:47.915 "num_base_bdevs_operational": 3, 00:12:47.915 "base_bdevs_list": [ 00:12:47.915 { 00:12:47.915 "name": "BaseBdev1", 00:12:47.915 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:47.915 "is_configured": true, 00:12:47.915 "data_offset": 0, 00:12:47.915 "data_size": 65536 00:12:47.915 }, 00:12:47.915 { 00:12:47.915 "name": null, 00:12:47.915 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:47.915 "is_configured": false, 00:12:47.915 "data_offset": 0, 00:12:47.915 "data_size": 65536 00:12:47.915 }, 00:12:47.915 { 00:12:47.915 "name": null, 00:12:47.915 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:47.915 "is_configured": false, 00:12:47.915 "data_offset": 0, 00:12:47.915 "data_size": 65536 00:12:47.915 } 00:12:47.915 ] 00:12:47.915 }' 00:12:47.915 07:48:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.915 07:48:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.486 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.486 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:48.486 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:48.486 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:48.745 [2024-07-15 07:48:33.360836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
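Illustrative sketch, not captured output: the remove/re-add cycle exercised just above uses only two raid RPCs plus the same jq probe the test runs. Assuming the same RPC socket is still up, a condensed by-hand replay (the RPC variable is shorthand, not from the log) would be:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $RPC bdev_raid_remove_base_bdev BaseBdev3                                   # slot 2 becomes unconfigured, raid stays "configuring"
  $RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect false
  $RPC bdev_raid_add_base_bdev Existed_Raid BaseBdev3                         # claim BaseBdev3 back into slot 2
  $RPC bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # expect true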
00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.745 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.004 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.005 "name": "Existed_Raid", 00:12:49.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.005 "strip_size_kb": 64, 00:12:49.005 "state": "configuring", 00:12:49.005 "raid_level": "raid0", 00:12:49.005 "superblock": false, 00:12:49.005 "num_base_bdevs": 3, 00:12:49.005 "num_base_bdevs_discovered": 2, 00:12:49.005 "num_base_bdevs_operational": 3, 00:12:49.005 "base_bdevs_list": [ 00:12:49.005 { 00:12:49.005 "name": "BaseBdev1", 00:12:49.005 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:49.005 "is_configured": true, 00:12:49.005 "data_offset": 0, 00:12:49.005 "data_size": 65536 00:12:49.005 }, 00:12:49.005 { 00:12:49.005 "name": null, 00:12:49.005 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:49.005 "is_configured": false, 00:12:49.005 "data_offset": 0, 00:12:49.005 "data_size": 65536 00:12:49.005 }, 00:12:49.005 { 00:12:49.005 "name": "BaseBdev3", 00:12:49.005 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:49.005 "is_configured": true, 00:12:49.005 "data_offset": 0, 00:12:49.005 "data_size": 65536 00:12:49.005 } 00:12:49.005 ] 00:12:49.005 }' 00:12:49.005 07:48:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.005 07:48:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.575 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.575 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:49.575 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:49.575 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:49.836 [2024-07-15 07:48:34.451600] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:49.836 
07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.836 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.096 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.096 "name": "Existed_Raid", 00:12:50.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.096 "strip_size_kb": 64, 00:12:50.096 "state": "configuring", 00:12:50.096 "raid_level": "raid0", 00:12:50.096 "superblock": false, 00:12:50.096 "num_base_bdevs": 3, 00:12:50.096 "num_base_bdevs_discovered": 1, 00:12:50.096 "num_base_bdevs_operational": 3, 00:12:50.096 "base_bdevs_list": [ 00:12:50.096 { 00:12:50.096 "name": null, 00:12:50.096 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:50.096 "is_configured": false, 00:12:50.096 "data_offset": 0, 00:12:50.096 "data_size": 65536 00:12:50.096 }, 00:12:50.096 { 00:12:50.096 "name": null, 00:12:50.096 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:50.096 "is_configured": false, 00:12:50.096 "data_offset": 0, 00:12:50.096 "data_size": 65536 00:12:50.096 }, 00:12:50.096 { 00:12:50.096 "name": "BaseBdev3", 00:12:50.096 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:50.096 "is_configured": true, 00:12:50.096 "data_offset": 0, 00:12:50.096 "data_size": 65536 00:12:50.096 } 00:12:50.096 ] 00:12:50.096 }' 00:12:50.096 07:48:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.096 07:48:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.667 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.667 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:50.667 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:50.667 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:50.927 [2024-07-15 07:48:35.584262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
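The trace around this point is the test's verify_raid_bdev_state helper at work: it pulls the raid bdev's JSON over the test RPC socket and compares the reported fields (state, raid level, strip size, number of discovered base bdevs) against the values passed in by the caller (here Existed_Raid configuring raid0 64 3). As a rough standalone equivalent of that check — a minimal bash sketch against an already-running bdev_svc on /var/tmp/spdk-raid.sock, with paths relative to the SPDK checkout, not the exact bdev_raid.sh code — the same comparison can be reproduced with rpc.py and jq:

    # Fetch all raid bdevs and keep only the one named Existed_Raid.
    info=$(./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "Existed_Raid")')

    # Compare the fields the trace keeps re-checking.
    [ "$(echo "$info" | jq -r '.state')" = "configuring" ]   || echo "unexpected state"
    [ "$(echo "$info" | jq -r '.raid_level')" = "raid0" ]    || echo "unexpected raid level"
    [ "$(echo "$info" | jq -r '.strip_size_kb')" = "64" ]    || echo "unexpected strip size"

    # num_base_bdevs_discovered: count the base bdev slots that are already configured.
    echo "$info" | jq '[.base_bdevs_list[] | select(.is_configured == true)] | length'

The jq field names are the ones visible in the JSON dumps above; the expected values are simply the arguments of the verify_raid_bdev_state call in the trace.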
00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.927 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.187 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.187 "name": "Existed_Raid", 00:12:51.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.187 "strip_size_kb": 64, 00:12:51.187 "state": "configuring", 00:12:51.187 "raid_level": "raid0", 00:12:51.187 "superblock": false, 00:12:51.187 "num_base_bdevs": 3, 00:12:51.187 "num_base_bdevs_discovered": 2, 00:12:51.187 "num_base_bdevs_operational": 3, 00:12:51.187 "base_bdevs_list": [ 00:12:51.187 { 00:12:51.187 "name": null, 00:12:51.187 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:51.187 "is_configured": false, 00:12:51.187 "data_offset": 0, 00:12:51.187 "data_size": 65536 00:12:51.187 }, 00:12:51.187 { 00:12:51.187 "name": "BaseBdev2", 00:12:51.187 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:51.187 "is_configured": true, 00:12:51.187 "data_offset": 0, 00:12:51.187 "data_size": 65536 00:12:51.187 }, 00:12:51.187 { 00:12:51.187 "name": "BaseBdev3", 00:12:51.187 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:51.187 "is_configured": true, 00:12:51.187 "data_offset": 0, 00:12:51.187 "data_size": 65536 00:12:51.187 } 00:12:51.187 ] 00:12:51.187 }' 00:12:51.187 07:48:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.187 07:48:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.758 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.758 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:51.758 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:51.758 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.758 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:52.019 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 836cbf72-4a0a-4857-a5e5-360043b3143a 00:12:52.280 [2024-07-15 07:48:36.876446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:52.280 [2024-07-15 07:48:36.876473] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12d2780 00:12:52.280 [2024-07-15 07:48:36.876477] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:52.280 [2024-07-15 07:48:36.876622] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1484190 00:12:52.280 [2024-07-15 07:48:36.876717] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12d2780 00:12:52.280 [2024-07-15 07:48:36.876723] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12d2780 00:12:52.280 [2024-07-15 07:48:36.876841] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.280 NewBaseBdev 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:52.280 07:48:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:52.541 [ 00:12:52.541 { 00:12:52.541 "name": "NewBaseBdev", 00:12:52.541 "aliases": [ 00:12:52.541 "836cbf72-4a0a-4857-a5e5-360043b3143a" 00:12:52.541 ], 00:12:52.541 "product_name": "Malloc disk", 00:12:52.541 "block_size": 512, 00:12:52.541 "num_blocks": 65536, 00:12:52.541 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:52.541 "assigned_rate_limits": { 00:12:52.541 "rw_ios_per_sec": 0, 00:12:52.541 "rw_mbytes_per_sec": 0, 00:12:52.541 "r_mbytes_per_sec": 0, 00:12:52.541 "w_mbytes_per_sec": 0 00:12:52.541 }, 00:12:52.541 "claimed": true, 00:12:52.541 "claim_type": "exclusive_write", 00:12:52.541 "zoned": false, 00:12:52.541 "supported_io_types": { 00:12:52.541 "read": true, 00:12:52.541 "write": true, 00:12:52.541 "unmap": true, 00:12:52.541 "flush": true, 00:12:52.541 "reset": true, 00:12:52.541 "nvme_admin": false, 00:12:52.541 "nvme_io": false, 00:12:52.541 "nvme_io_md": false, 00:12:52.541 "write_zeroes": true, 00:12:52.541 "zcopy": true, 00:12:52.541 "get_zone_info": false, 00:12:52.541 "zone_management": false, 00:12:52.541 "zone_append": false, 00:12:52.541 "compare": false, 00:12:52.541 "compare_and_write": false, 00:12:52.541 "abort": true, 00:12:52.541 "seek_hole": false, 00:12:52.541 "seek_data": false, 00:12:52.541 "copy": true, 00:12:52.541 "nvme_iov_md": false 00:12:52.541 }, 00:12:52.541 "memory_domains": [ 00:12:52.541 { 00:12:52.541 "dma_device_id": "system", 00:12:52.541 
"dma_device_type": 1 00:12:52.541 }, 00:12:52.541 { 00:12:52.541 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.541 "dma_device_type": 2 00:12:52.541 } 00:12:52.541 ], 00:12:52.541 "driver_specific": {} 00:12:52.541 } 00:12:52.541 ] 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.541 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.801 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.801 "name": "Existed_Raid", 00:12:52.801 "uuid": "25b109da-25c1-4570-9118-ed3007869d18", 00:12:52.801 "strip_size_kb": 64, 00:12:52.801 "state": "online", 00:12:52.801 "raid_level": "raid0", 00:12:52.801 "superblock": false, 00:12:52.801 "num_base_bdevs": 3, 00:12:52.801 "num_base_bdevs_discovered": 3, 00:12:52.801 "num_base_bdevs_operational": 3, 00:12:52.801 "base_bdevs_list": [ 00:12:52.801 { 00:12:52.801 "name": "NewBaseBdev", 00:12:52.801 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:52.801 "is_configured": true, 00:12:52.801 "data_offset": 0, 00:12:52.801 "data_size": 65536 00:12:52.801 }, 00:12:52.801 { 00:12:52.801 "name": "BaseBdev2", 00:12:52.801 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:52.801 "is_configured": true, 00:12:52.801 "data_offset": 0, 00:12:52.801 "data_size": 65536 00:12:52.801 }, 00:12:52.801 { 00:12:52.801 "name": "BaseBdev3", 00:12:52.801 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:52.801 "is_configured": true, 00:12:52.801 "data_offset": 0, 00:12:52.801 "data_size": 65536 00:12:52.801 } 00:12:52.801 ] 00:12:52.801 }' 00:12:52.801 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.801 07:48:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:53.372 07:48:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:53.632 [2024-07-15 07:48:38.175974] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:53.632 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:53.632 "name": "Existed_Raid", 00:12:53.632 "aliases": [ 00:12:53.632 "25b109da-25c1-4570-9118-ed3007869d18" 00:12:53.632 ], 00:12:53.632 "product_name": "Raid Volume", 00:12:53.632 "block_size": 512, 00:12:53.632 "num_blocks": 196608, 00:12:53.632 "uuid": "25b109da-25c1-4570-9118-ed3007869d18", 00:12:53.632 "assigned_rate_limits": { 00:12:53.632 "rw_ios_per_sec": 0, 00:12:53.632 "rw_mbytes_per_sec": 0, 00:12:53.632 "r_mbytes_per_sec": 0, 00:12:53.632 "w_mbytes_per_sec": 0 00:12:53.632 }, 00:12:53.632 "claimed": false, 00:12:53.632 "zoned": false, 00:12:53.632 "supported_io_types": { 00:12:53.632 "read": true, 00:12:53.632 "write": true, 00:12:53.632 "unmap": true, 00:12:53.632 "flush": true, 00:12:53.632 "reset": true, 00:12:53.632 "nvme_admin": false, 00:12:53.632 "nvme_io": false, 00:12:53.632 "nvme_io_md": false, 00:12:53.632 "write_zeroes": true, 00:12:53.632 "zcopy": false, 00:12:53.632 "get_zone_info": false, 00:12:53.632 "zone_management": false, 00:12:53.632 "zone_append": false, 00:12:53.632 "compare": false, 00:12:53.632 "compare_and_write": false, 00:12:53.632 "abort": false, 00:12:53.632 "seek_hole": false, 00:12:53.632 "seek_data": false, 00:12:53.632 "copy": false, 00:12:53.632 "nvme_iov_md": false 00:12:53.632 }, 00:12:53.632 "memory_domains": [ 00:12:53.632 { 00:12:53.632 "dma_device_id": "system", 00:12:53.632 "dma_device_type": 1 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.632 "dma_device_type": 2 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "dma_device_id": "system", 00:12:53.632 "dma_device_type": 1 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.632 "dma_device_type": 2 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "dma_device_id": "system", 00:12:53.632 "dma_device_type": 1 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.632 "dma_device_type": 2 00:12:53.632 } 00:12:53.632 ], 00:12:53.632 "driver_specific": { 00:12:53.632 "raid": { 00:12:53.632 "uuid": "25b109da-25c1-4570-9118-ed3007869d18", 00:12:53.632 "strip_size_kb": 64, 00:12:53.632 "state": "online", 00:12:53.632 "raid_level": "raid0", 00:12:53.632 "superblock": false, 00:12:53.632 "num_base_bdevs": 3, 00:12:53.632 "num_base_bdevs_discovered": 3, 00:12:53.632 "num_base_bdevs_operational": 3, 00:12:53.632 "base_bdevs_list": [ 00:12:53.632 { 00:12:53.632 "name": "NewBaseBdev", 00:12:53.632 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:53.632 "is_configured": true, 00:12:53.632 "data_offset": 0, 00:12:53.632 "data_size": 65536 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "name": 
"BaseBdev2", 00:12:53.632 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:53.632 "is_configured": true, 00:12:53.632 "data_offset": 0, 00:12:53.632 "data_size": 65536 00:12:53.632 }, 00:12:53.632 { 00:12:53.632 "name": "BaseBdev3", 00:12:53.632 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:53.632 "is_configured": true, 00:12:53.632 "data_offset": 0, 00:12:53.632 "data_size": 65536 00:12:53.632 } 00:12:53.632 ] 00:12:53.632 } 00:12:53.632 } 00:12:53.632 }' 00:12:53.632 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:53.632 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:53.632 BaseBdev2 00:12:53.632 BaseBdev3' 00:12:53.632 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.632 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:53.632 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.892 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.892 "name": "NewBaseBdev", 00:12:53.892 "aliases": [ 00:12:53.892 "836cbf72-4a0a-4857-a5e5-360043b3143a" 00:12:53.892 ], 00:12:53.892 "product_name": "Malloc disk", 00:12:53.892 "block_size": 512, 00:12:53.892 "num_blocks": 65536, 00:12:53.892 "uuid": "836cbf72-4a0a-4857-a5e5-360043b3143a", 00:12:53.892 "assigned_rate_limits": { 00:12:53.892 "rw_ios_per_sec": 0, 00:12:53.892 "rw_mbytes_per_sec": 0, 00:12:53.892 "r_mbytes_per_sec": 0, 00:12:53.892 "w_mbytes_per_sec": 0 00:12:53.892 }, 00:12:53.892 "claimed": true, 00:12:53.892 "claim_type": "exclusive_write", 00:12:53.892 "zoned": false, 00:12:53.892 "supported_io_types": { 00:12:53.892 "read": true, 00:12:53.892 "write": true, 00:12:53.892 "unmap": true, 00:12:53.892 "flush": true, 00:12:53.892 "reset": true, 00:12:53.892 "nvme_admin": false, 00:12:53.892 "nvme_io": false, 00:12:53.892 "nvme_io_md": false, 00:12:53.892 "write_zeroes": true, 00:12:53.892 "zcopy": true, 00:12:53.892 "get_zone_info": false, 00:12:53.892 "zone_management": false, 00:12:53.892 "zone_append": false, 00:12:53.892 "compare": false, 00:12:53.892 "compare_and_write": false, 00:12:53.892 "abort": true, 00:12:53.892 "seek_hole": false, 00:12:53.892 "seek_data": false, 00:12:53.892 "copy": true, 00:12:53.892 "nvme_iov_md": false 00:12:53.892 }, 00:12:53.892 "memory_domains": [ 00:12:53.892 { 00:12:53.892 "dma_device_id": "system", 00:12:53.892 "dma_device_type": 1 00:12:53.893 }, 00:12:53.893 { 00:12:53.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.893 "dma_device_type": 2 00:12:53.893 } 00:12:53.893 ], 00:12:53.893 "driver_specific": {} 00:12:53.893 }' 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null 
== null ]] 00:12:53.893 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:54.152 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:54.412 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:54.412 "name": "BaseBdev2", 00:12:54.412 "aliases": [ 00:12:54.412 "9d82a0fe-ca5b-416f-a65b-08eb10e89024" 00:12:54.412 ], 00:12:54.412 "product_name": "Malloc disk", 00:12:54.412 "block_size": 512, 00:12:54.412 "num_blocks": 65536, 00:12:54.412 "uuid": "9d82a0fe-ca5b-416f-a65b-08eb10e89024", 00:12:54.412 "assigned_rate_limits": { 00:12:54.412 "rw_ios_per_sec": 0, 00:12:54.412 "rw_mbytes_per_sec": 0, 00:12:54.412 "r_mbytes_per_sec": 0, 00:12:54.412 "w_mbytes_per_sec": 0 00:12:54.412 }, 00:12:54.412 "claimed": true, 00:12:54.412 "claim_type": "exclusive_write", 00:12:54.412 "zoned": false, 00:12:54.412 "supported_io_types": { 00:12:54.412 "read": true, 00:12:54.412 "write": true, 00:12:54.412 "unmap": true, 00:12:54.412 "flush": true, 00:12:54.412 "reset": true, 00:12:54.412 "nvme_admin": false, 00:12:54.412 "nvme_io": false, 00:12:54.412 "nvme_io_md": false, 00:12:54.412 "write_zeroes": true, 00:12:54.412 "zcopy": true, 00:12:54.412 "get_zone_info": false, 00:12:54.412 "zone_management": false, 00:12:54.412 "zone_append": false, 00:12:54.412 "compare": false, 00:12:54.412 "compare_and_write": false, 00:12:54.412 "abort": true, 00:12:54.412 "seek_hole": false, 00:12:54.412 "seek_data": false, 00:12:54.412 "copy": true, 00:12:54.412 "nvme_iov_md": false 00:12:54.412 }, 00:12:54.412 "memory_domains": [ 00:12:54.412 { 00:12:54.412 "dma_device_id": "system", 00:12:54.412 "dma_device_type": 1 00:12:54.412 }, 00:12:54.412 { 00:12:54.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.412 "dma_device_type": 2 00:12:54.412 } 00:12:54.412 ], 00:12:54.412 "driver_specific": {} 00:12:54.412 }' 00:12:54.412 07:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.412 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.412 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:54.412 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.412 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.412 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:54.412 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:54.672 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:54.932 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:54.932 "name": "BaseBdev3", 00:12:54.932 "aliases": [ 00:12:54.932 "428711be-a42d-47ba-bee9-e63c747bda8c" 00:12:54.932 ], 00:12:54.932 "product_name": "Malloc disk", 00:12:54.932 "block_size": 512, 00:12:54.932 "num_blocks": 65536, 00:12:54.932 "uuid": "428711be-a42d-47ba-bee9-e63c747bda8c", 00:12:54.932 "assigned_rate_limits": { 00:12:54.932 "rw_ios_per_sec": 0, 00:12:54.932 "rw_mbytes_per_sec": 0, 00:12:54.932 "r_mbytes_per_sec": 0, 00:12:54.932 "w_mbytes_per_sec": 0 00:12:54.932 }, 00:12:54.932 "claimed": true, 00:12:54.932 "claim_type": "exclusive_write", 00:12:54.932 "zoned": false, 00:12:54.932 "supported_io_types": { 00:12:54.932 "read": true, 00:12:54.932 "write": true, 00:12:54.932 "unmap": true, 00:12:54.932 "flush": true, 00:12:54.932 "reset": true, 00:12:54.932 "nvme_admin": false, 00:12:54.932 "nvme_io": false, 00:12:54.932 "nvme_io_md": false, 00:12:54.932 "write_zeroes": true, 00:12:54.932 "zcopy": true, 00:12:54.932 "get_zone_info": false, 00:12:54.932 "zone_management": false, 00:12:54.932 "zone_append": false, 00:12:54.932 "compare": false, 00:12:54.932 "compare_and_write": false, 00:12:54.932 "abort": true, 00:12:54.932 "seek_hole": false, 00:12:54.932 "seek_data": false, 00:12:54.932 "copy": true, 00:12:54.932 "nvme_iov_md": false 00:12:54.932 }, 00:12:54.932 "memory_domains": [ 00:12:54.932 { 00:12:54.932 "dma_device_id": "system", 00:12:54.932 "dma_device_type": 1 00:12:54.932 }, 00:12:54.932 { 00:12:54.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.932 "dma_device_type": 2 00:12:54.932 } 00:12:54.932 ], 00:12:54.932 "driver_specific": {} 00:12:54.932 }' 00:12:54.932 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.932 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.932 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:54.932 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.932 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:55.192 07:48:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:55.192 07:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:55.453 [2024-07-15 07:48:40.072537] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:55.453 [2024-07-15 07:48:40.072558] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:55.453 [2024-07-15 07:48:40.072597] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:55.453 [2024-07-15 07:48:40.072632] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:55.453 [2024-07-15 07:48:40.072638] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d2780 name Existed_Raid, state offline 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1609616 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1609616 ']' 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1609616 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1609616 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:55.453 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:55.454 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1609616' 00:12:55.454 killing process with pid 1609616 00:12:55.454 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1609616 00:12:55.454 [2024-07-15 07:48:40.139591] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:55.454 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1609616 00:12:55.454 [2024-07-15 07:48:40.154027] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:55.715 00:12:55.715 real 0m23.893s 00:12:55.715 user 0m44.791s 00:12:55.715 sys 0m3.503s 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.715 ************************************ 00:12:55.715 END TEST raid_state_function_test 00:12:55.715 ************************************ 00:12:55.715 07:48:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:55.715 07:48:40 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:12:55.715 07:48:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:55.715 
07:48:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:55.715 07:48:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:55.715 ************************************ 00:12:55.715 START TEST raid_state_function_test_sb 00:12:55.715 ************************************ 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1614193 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process 
raid pid: 1614193' 00:12:55.715 Process raid pid: 1614193 00:12:55.715 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1614193 /var/tmp/spdk-raid.sock 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1614193 ']' 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:55.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:55.716 07:48:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:55.716 [2024-07-15 07:48:40.419549] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:12:55.716 [2024-07-15 07:48:40.419611] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:55.976 [2024-07-15 07:48:40.509489] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.976 [2024-07-15 07:48:40.585511] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.976 [2024-07-15 07:48:40.635417] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.976 [2024-07-15 07:48:40.635439] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.546 07:48:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:56.546 07:48:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:56.546 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:56.806 [2024-07-15 07:48:41.432036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:56.806 [2024-07-15 07:48:41.432066] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:56.806 [2024-07-15 07:48:41.432072] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:56.806 [2024-07-15 07:48:41.432078] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:56.806 [2024-07-15 07:48:41.432082] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:56.806 [2024-07-15 07:48:41.432088] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
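At this point the log has switched from raid_state_function_test to its superblock variant, raid_state_function_test_sb, which runs the same state machine with superblock=true: the array is created with the -s flag of bdev_raid_create, so raid metadata is stored on each base bdev, which is consistent with the later dumps reporting data_offset 2048 / data_size 63488 instead of 0 / 65536. A minimal sketch of the setup sequence the trace performs, built only from the commands visible in the log (paths relative to the SPDK checkout; an illustration, not the full bdev_raid.sh flow):

    # Start the bare bdev application the test drives over a private RPC socket
    # (the test then waits for that socket before issuing RPCs).
    ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

    # Register the raid0 volume first; its base bdevs do not exist yet,
    # so the raid bdev stays in the "configuring" state.
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s \
            -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

    # Add the base bdevs one at a time (32 MiB of 512-byte blocks, i.e. the
    # 65536 blocks shown in the dumps); each creation claims one slot and bumps
    # num_base_bdevs_discovered until the array can go online.
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1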
00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.806 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.067 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.067 "name": "Existed_Raid", 00:12:57.067 "uuid": "de02ee13-1742-403c-a323-4a5c52eb3c66", 00:12:57.067 "strip_size_kb": 64, 00:12:57.067 "state": "configuring", 00:12:57.067 "raid_level": "raid0", 00:12:57.067 "superblock": true, 00:12:57.067 "num_base_bdevs": 3, 00:12:57.067 "num_base_bdevs_discovered": 0, 00:12:57.067 "num_base_bdevs_operational": 3, 00:12:57.067 "base_bdevs_list": [ 00:12:57.067 { 00:12:57.067 "name": "BaseBdev1", 00:12:57.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.067 "is_configured": false, 00:12:57.067 "data_offset": 0, 00:12:57.067 "data_size": 0 00:12:57.067 }, 00:12:57.067 { 00:12:57.067 "name": "BaseBdev2", 00:12:57.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.067 "is_configured": false, 00:12:57.067 "data_offset": 0, 00:12:57.067 "data_size": 0 00:12:57.067 }, 00:12:57.067 { 00:12:57.067 "name": "BaseBdev3", 00:12:57.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.067 "is_configured": false, 00:12:57.067 "data_offset": 0, 00:12:57.067 "data_size": 0 00:12:57.067 } 00:12:57.067 ] 00:12:57.067 }' 00:12:57.067 07:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.067 07:48:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:57.638 07:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:57.638 [2024-07-15 07:48:42.362274] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:57.638 [2024-07-15 07:48:42.362296] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a786d0 name Existed_Raid, state configuring 00:12:57.638 07:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:57.900 
[2024-07-15 07:48:42.542765] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:57.900 [2024-07-15 07:48:42.542787] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:57.900 [2024-07-15 07:48:42.542792] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:57.900 [2024-07-15 07:48:42.542798] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:57.900 [2024-07-15 07:48:42.542802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:57.900 [2024-07-15 07:48:42.542807] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:57.900 07:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:58.161 [2024-07-15 07:48:42.745589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:58.161 BaseBdev1 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:58.161 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:58.421 07:48:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:58.421 [ 00:12:58.421 { 00:12:58.421 "name": "BaseBdev1", 00:12:58.421 "aliases": [ 00:12:58.421 "7e48b959-7de7-49e1-9be8-1098617dd4b2" 00:12:58.421 ], 00:12:58.421 "product_name": "Malloc disk", 00:12:58.421 "block_size": 512, 00:12:58.421 "num_blocks": 65536, 00:12:58.421 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:12:58.421 "assigned_rate_limits": { 00:12:58.421 "rw_ios_per_sec": 0, 00:12:58.421 "rw_mbytes_per_sec": 0, 00:12:58.421 "r_mbytes_per_sec": 0, 00:12:58.421 "w_mbytes_per_sec": 0 00:12:58.421 }, 00:12:58.421 "claimed": true, 00:12:58.422 "claim_type": "exclusive_write", 00:12:58.422 "zoned": false, 00:12:58.422 "supported_io_types": { 00:12:58.422 "read": true, 00:12:58.422 "write": true, 00:12:58.422 "unmap": true, 00:12:58.422 "flush": true, 00:12:58.422 "reset": true, 00:12:58.422 "nvme_admin": false, 00:12:58.422 "nvme_io": false, 00:12:58.422 "nvme_io_md": false, 00:12:58.422 "write_zeroes": true, 00:12:58.422 "zcopy": true, 00:12:58.422 "get_zone_info": false, 00:12:58.422 "zone_management": false, 00:12:58.422 "zone_append": false, 00:12:58.422 "compare": false, 00:12:58.422 "compare_and_write": false, 00:12:58.422 "abort": true, 00:12:58.422 "seek_hole": false, 00:12:58.422 "seek_data": false, 00:12:58.422 "copy": true, 00:12:58.422 
"nvme_iov_md": false 00:12:58.422 }, 00:12:58.422 "memory_domains": [ 00:12:58.422 { 00:12:58.422 "dma_device_id": "system", 00:12:58.422 "dma_device_type": 1 00:12:58.422 }, 00:12:58.422 { 00:12:58.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.422 "dma_device_type": 2 00:12:58.422 } 00:12:58.422 ], 00:12:58.422 "driver_specific": {} 00:12:58.422 } 00:12:58.422 ] 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.422 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.682 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.682 "name": "Existed_Raid", 00:12:58.682 "uuid": "391f239d-6746-4cd2-a8df-3b24e9af46ef", 00:12:58.682 "strip_size_kb": 64, 00:12:58.682 "state": "configuring", 00:12:58.682 "raid_level": "raid0", 00:12:58.682 "superblock": true, 00:12:58.682 "num_base_bdevs": 3, 00:12:58.682 "num_base_bdevs_discovered": 1, 00:12:58.682 "num_base_bdevs_operational": 3, 00:12:58.682 "base_bdevs_list": [ 00:12:58.682 { 00:12:58.682 "name": "BaseBdev1", 00:12:58.682 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:12:58.682 "is_configured": true, 00:12:58.682 "data_offset": 2048, 00:12:58.682 "data_size": 63488 00:12:58.682 }, 00:12:58.682 { 00:12:58.682 "name": "BaseBdev2", 00:12:58.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.682 "is_configured": false, 00:12:58.682 "data_offset": 0, 00:12:58.682 "data_size": 0 00:12:58.682 }, 00:12:58.682 { 00:12:58.682 "name": "BaseBdev3", 00:12:58.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.682 "is_configured": false, 00:12:58.682 "data_offset": 0, 00:12:58.682 "data_size": 0 00:12:58.682 } 00:12:58.682 ] 00:12:58.682 }' 00:12:58.682 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.682 07:48:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.249 07:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:59.508 [2024-07-15 07:48:44.048875] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:59.508 [2024-07-15 07:48:44.048904] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a77fa0 name Existed_Raid, state configuring 00:12:59.508 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:59.508 [2024-07-15 07:48:44.233375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:59.508 [2024-07-15 07:48:44.234505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:59.508 [2024-07-15 07:48:44.234530] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:59.508 [2024-07-15 07:48:44.234537] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:59.508 [2024-07-15 07:48:44.234543] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:59.508 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:59.508 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:59.508 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.509 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.768 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.768 "name": "Existed_Raid", 00:12:59.768 "uuid": "f024136e-df10-4e98-84e0-005f1c6b5955", 00:12:59.768 "strip_size_kb": 64, 00:12:59.768 "state": "configuring", 00:12:59.768 "raid_level": "raid0", 00:12:59.768 "superblock": true, 00:12:59.768 "num_base_bdevs": 3, 00:12:59.768 "num_base_bdevs_discovered": 1, 00:12:59.768 "num_base_bdevs_operational": 3, 00:12:59.768 "base_bdevs_list": [ 
00:12:59.768 { 00:12:59.768 "name": "BaseBdev1", 00:12:59.768 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:12:59.768 "is_configured": true, 00:12:59.768 "data_offset": 2048, 00:12:59.768 "data_size": 63488 00:12:59.768 }, 00:12:59.768 { 00:12:59.768 "name": "BaseBdev2", 00:12:59.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.768 "is_configured": false, 00:12:59.768 "data_offset": 0, 00:12:59.768 "data_size": 0 00:12:59.768 }, 00:12:59.768 { 00:12:59.768 "name": "BaseBdev3", 00:12:59.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.768 "is_configured": false, 00:12:59.768 "data_offset": 0, 00:12:59.768 "data_size": 0 00:12:59.768 } 00:12:59.768 ] 00:12:59.768 }' 00:12:59.768 07:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.768 07:48:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:00.355 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:00.617 [2024-07-15 07:48:45.184581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:00.617 BaseBdev2 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:00.617 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:00.884 [ 00:13:00.884 { 00:13:00.884 "name": "BaseBdev2", 00:13:00.884 "aliases": [ 00:13:00.884 "b57b5b67-cc3a-460b-9fa5-c9ff254a2373" 00:13:00.884 ], 00:13:00.884 "product_name": "Malloc disk", 00:13:00.884 "block_size": 512, 00:13:00.884 "num_blocks": 65536, 00:13:00.884 "uuid": "b57b5b67-cc3a-460b-9fa5-c9ff254a2373", 00:13:00.884 "assigned_rate_limits": { 00:13:00.884 "rw_ios_per_sec": 0, 00:13:00.884 "rw_mbytes_per_sec": 0, 00:13:00.884 "r_mbytes_per_sec": 0, 00:13:00.884 "w_mbytes_per_sec": 0 00:13:00.884 }, 00:13:00.884 "claimed": true, 00:13:00.884 "claim_type": "exclusive_write", 00:13:00.884 "zoned": false, 00:13:00.884 "supported_io_types": { 00:13:00.884 "read": true, 00:13:00.884 "write": true, 00:13:00.884 "unmap": true, 00:13:00.884 "flush": true, 00:13:00.884 "reset": true, 00:13:00.884 "nvme_admin": false, 00:13:00.884 "nvme_io": false, 00:13:00.884 "nvme_io_md": false, 00:13:00.884 "write_zeroes": true, 00:13:00.884 "zcopy": true, 00:13:00.884 "get_zone_info": false, 00:13:00.884 "zone_management": false, 00:13:00.884 "zone_append": false, 00:13:00.884 "compare": false, 00:13:00.884 "compare_and_write": false, 
00:13:00.884 "abort": true, 00:13:00.884 "seek_hole": false, 00:13:00.884 "seek_data": false, 00:13:00.884 "copy": true, 00:13:00.884 "nvme_iov_md": false 00:13:00.884 }, 00:13:00.884 "memory_domains": [ 00:13:00.884 { 00:13:00.884 "dma_device_id": "system", 00:13:00.884 "dma_device_type": 1 00:13:00.884 }, 00:13:00.884 { 00:13:00.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.884 "dma_device_type": 2 00:13:00.884 } 00:13:00.884 ], 00:13:00.884 "driver_specific": {} 00:13:00.884 } 00:13:00.884 ] 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.884 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.199 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.199 "name": "Existed_Raid", 00:13:01.199 "uuid": "f024136e-df10-4e98-84e0-005f1c6b5955", 00:13:01.199 "strip_size_kb": 64, 00:13:01.199 "state": "configuring", 00:13:01.199 "raid_level": "raid0", 00:13:01.199 "superblock": true, 00:13:01.199 "num_base_bdevs": 3, 00:13:01.199 "num_base_bdevs_discovered": 2, 00:13:01.199 "num_base_bdevs_operational": 3, 00:13:01.199 "base_bdevs_list": [ 00:13:01.199 { 00:13:01.199 "name": "BaseBdev1", 00:13:01.199 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:13:01.199 "is_configured": true, 00:13:01.199 "data_offset": 2048, 00:13:01.199 "data_size": 63488 00:13:01.199 }, 00:13:01.199 { 00:13:01.199 "name": "BaseBdev2", 00:13:01.199 "uuid": "b57b5b67-cc3a-460b-9fa5-c9ff254a2373", 00:13:01.199 "is_configured": true, 00:13:01.199 "data_offset": 2048, 00:13:01.199 "data_size": 63488 00:13:01.199 }, 00:13:01.199 { 00:13:01.199 "name": "BaseBdev3", 00:13:01.199 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.199 "is_configured": false, 00:13:01.199 "data_offset": 0, 00:13:01.199 "data_size": 0 00:13:01.199 } 00:13:01.199 
] 00:13:01.199 }' 00:13:01.199 07:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.199 07:48:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:01.770 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:01.770 [2024-07-15 07:48:46.501019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:01.770 [2024-07-15 07:48:46.501138] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a78e90 00:13:01.770 [2024-07-15 07:48:46.501147] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:01.770 [2024-07-15 07:48:46.501284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a78b60 00:13:01.770 [2024-07-15 07:48:46.501375] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a78e90 00:13:01.770 [2024-07-15 07:48:46.501381] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a78e90 00:13:01.770 [2024-07-15 07:48:46.501447] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:01.770 BaseBdev3 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:02.031 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:02.291 [ 00:13:02.291 { 00:13:02.291 "name": "BaseBdev3", 00:13:02.291 "aliases": [ 00:13:02.291 "b99a3f3f-fa9c-4cfa-b102-3001848c4998" 00:13:02.291 ], 00:13:02.291 "product_name": "Malloc disk", 00:13:02.291 "block_size": 512, 00:13:02.291 "num_blocks": 65536, 00:13:02.291 "uuid": "b99a3f3f-fa9c-4cfa-b102-3001848c4998", 00:13:02.291 "assigned_rate_limits": { 00:13:02.291 "rw_ios_per_sec": 0, 00:13:02.291 "rw_mbytes_per_sec": 0, 00:13:02.291 "r_mbytes_per_sec": 0, 00:13:02.291 "w_mbytes_per_sec": 0 00:13:02.291 }, 00:13:02.291 "claimed": true, 00:13:02.291 "claim_type": "exclusive_write", 00:13:02.291 "zoned": false, 00:13:02.291 "supported_io_types": { 00:13:02.291 "read": true, 00:13:02.291 "write": true, 00:13:02.291 "unmap": true, 00:13:02.291 "flush": true, 00:13:02.291 "reset": true, 00:13:02.291 "nvme_admin": false, 00:13:02.291 "nvme_io": false, 00:13:02.291 "nvme_io_md": false, 00:13:02.291 "write_zeroes": true, 00:13:02.291 "zcopy": true, 00:13:02.291 "get_zone_info": false, 00:13:02.291 "zone_management": false, 00:13:02.291 "zone_append": false, 
00:13:02.291 "compare": false, 00:13:02.291 "compare_and_write": false, 00:13:02.291 "abort": true, 00:13:02.291 "seek_hole": false, 00:13:02.291 "seek_data": false, 00:13:02.291 "copy": true, 00:13:02.291 "nvme_iov_md": false 00:13:02.291 }, 00:13:02.291 "memory_domains": [ 00:13:02.291 { 00:13:02.291 "dma_device_id": "system", 00:13:02.291 "dma_device_type": 1 00:13:02.291 }, 00:13:02.291 { 00:13:02.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.291 "dma_device_type": 2 00:13:02.291 } 00:13:02.291 ], 00:13:02.291 "driver_specific": {} 00:13:02.291 } 00:13:02.291 ] 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.291 07:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.551 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.551 "name": "Existed_Raid", 00:13:02.551 "uuid": "f024136e-df10-4e98-84e0-005f1c6b5955", 00:13:02.551 "strip_size_kb": 64, 00:13:02.551 "state": "online", 00:13:02.551 "raid_level": "raid0", 00:13:02.551 "superblock": true, 00:13:02.551 "num_base_bdevs": 3, 00:13:02.551 "num_base_bdevs_discovered": 3, 00:13:02.551 "num_base_bdevs_operational": 3, 00:13:02.551 "base_bdevs_list": [ 00:13:02.551 { 00:13:02.551 "name": "BaseBdev1", 00:13:02.551 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:13:02.551 "is_configured": true, 00:13:02.551 "data_offset": 2048, 00:13:02.551 "data_size": 63488 00:13:02.551 }, 00:13:02.551 { 00:13:02.551 "name": "BaseBdev2", 00:13:02.551 "uuid": "b57b5b67-cc3a-460b-9fa5-c9ff254a2373", 00:13:02.551 "is_configured": true, 00:13:02.551 "data_offset": 2048, 00:13:02.551 "data_size": 63488 00:13:02.551 }, 00:13:02.551 { 00:13:02.551 "name": "BaseBdev3", 00:13:02.551 "uuid": "b99a3f3f-fa9c-4cfa-b102-3001848c4998", 00:13:02.551 "is_configured": true, 00:13:02.551 "data_offset": 
2048, 00:13:02.551 "data_size": 63488 00:13:02.551 } 00:13:02.551 ] 00:13:02.551 }' 00:13:02.551 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.551 07:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:03.122 [2024-07-15 07:48:47.816586] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:03.122 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:03.122 "name": "Existed_Raid", 00:13:03.123 "aliases": [ 00:13:03.123 "f024136e-df10-4e98-84e0-005f1c6b5955" 00:13:03.123 ], 00:13:03.123 "product_name": "Raid Volume", 00:13:03.123 "block_size": 512, 00:13:03.123 "num_blocks": 190464, 00:13:03.123 "uuid": "f024136e-df10-4e98-84e0-005f1c6b5955", 00:13:03.123 "assigned_rate_limits": { 00:13:03.123 "rw_ios_per_sec": 0, 00:13:03.123 "rw_mbytes_per_sec": 0, 00:13:03.123 "r_mbytes_per_sec": 0, 00:13:03.123 "w_mbytes_per_sec": 0 00:13:03.123 }, 00:13:03.123 "claimed": false, 00:13:03.123 "zoned": false, 00:13:03.123 "supported_io_types": { 00:13:03.123 "read": true, 00:13:03.123 "write": true, 00:13:03.123 "unmap": true, 00:13:03.123 "flush": true, 00:13:03.123 "reset": true, 00:13:03.123 "nvme_admin": false, 00:13:03.123 "nvme_io": false, 00:13:03.123 "nvme_io_md": false, 00:13:03.123 "write_zeroes": true, 00:13:03.123 "zcopy": false, 00:13:03.123 "get_zone_info": false, 00:13:03.123 "zone_management": false, 00:13:03.123 "zone_append": false, 00:13:03.123 "compare": false, 00:13:03.123 "compare_and_write": false, 00:13:03.123 "abort": false, 00:13:03.123 "seek_hole": false, 00:13:03.123 "seek_data": false, 00:13:03.123 "copy": false, 00:13:03.123 "nvme_iov_md": false 00:13:03.123 }, 00:13:03.123 "memory_domains": [ 00:13:03.123 { 00:13:03.123 "dma_device_id": "system", 00:13:03.123 "dma_device_type": 1 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.123 "dma_device_type": 2 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "dma_device_id": "system", 00:13:03.123 "dma_device_type": 1 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.123 "dma_device_type": 2 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "dma_device_id": "system", 00:13:03.123 "dma_device_type": 1 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.123 "dma_device_type": 2 00:13:03.123 } 00:13:03.123 ], 00:13:03.123 "driver_specific": { 00:13:03.123 "raid": 
{ 00:13:03.123 "uuid": "f024136e-df10-4e98-84e0-005f1c6b5955", 00:13:03.123 "strip_size_kb": 64, 00:13:03.123 "state": "online", 00:13:03.123 "raid_level": "raid0", 00:13:03.123 "superblock": true, 00:13:03.123 "num_base_bdevs": 3, 00:13:03.123 "num_base_bdevs_discovered": 3, 00:13:03.123 "num_base_bdevs_operational": 3, 00:13:03.123 "base_bdevs_list": [ 00:13:03.123 { 00:13:03.123 "name": "BaseBdev1", 00:13:03.123 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:13:03.123 "is_configured": true, 00:13:03.123 "data_offset": 2048, 00:13:03.123 "data_size": 63488 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "name": "BaseBdev2", 00:13:03.123 "uuid": "b57b5b67-cc3a-460b-9fa5-c9ff254a2373", 00:13:03.123 "is_configured": true, 00:13:03.123 "data_offset": 2048, 00:13:03.123 "data_size": 63488 00:13:03.123 }, 00:13:03.123 { 00:13:03.123 "name": "BaseBdev3", 00:13:03.123 "uuid": "b99a3f3f-fa9c-4cfa-b102-3001848c4998", 00:13:03.123 "is_configured": true, 00:13:03.123 "data_offset": 2048, 00:13:03.123 "data_size": 63488 00:13:03.123 } 00:13:03.123 ] 00:13:03.123 } 00:13:03.123 } 00:13:03.123 }' 00:13:03.123 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:03.383 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:03.383 BaseBdev2 00:13:03.383 BaseBdev3' 00:13:03.383 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.383 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:03.383 07:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.383 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.383 "name": "BaseBdev1", 00:13:03.383 "aliases": [ 00:13:03.383 "7e48b959-7de7-49e1-9be8-1098617dd4b2" 00:13:03.383 ], 00:13:03.383 "product_name": "Malloc disk", 00:13:03.383 "block_size": 512, 00:13:03.383 "num_blocks": 65536, 00:13:03.383 "uuid": "7e48b959-7de7-49e1-9be8-1098617dd4b2", 00:13:03.383 "assigned_rate_limits": { 00:13:03.383 "rw_ios_per_sec": 0, 00:13:03.383 "rw_mbytes_per_sec": 0, 00:13:03.383 "r_mbytes_per_sec": 0, 00:13:03.383 "w_mbytes_per_sec": 0 00:13:03.383 }, 00:13:03.383 "claimed": true, 00:13:03.383 "claim_type": "exclusive_write", 00:13:03.383 "zoned": false, 00:13:03.383 "supported_io_types": { 00:13:03.383 "read": true, 00:13:03.383 "write": true, 00:13:03.383 "unmap": true, 00:13:03.383 "flush": true, 00:13:03.383 "reset": true, 00:13:03.383 "nvme_admin": false, 00:13:03.383 "nvme_io": false, 00:13:03.383 "nvme_io_md": false, 00:13:03.383 "write_zeroes": true, 00:13:03.383 "zcopy": true, 00:13:03.383 "get_zone_info": false, 00:13:03.383 "zone_management": false, 00:13:03.383 "zone_append": false, 00:13:03.383 "compare": false, 00:13:03.383 "compare_and_write": false, 00:13:03.384 "abort": true, 00:13:03.384 "seek_hole": false, 00:13:03.384 "seek_data": false, 00:13:03.384 "copy": true, 00:13:03.384 "nvme_iov_md": false 00:13:03.384 }, 00:13:03.384 "memory_domains": [ 00:13:03.384 { 00:13:03.384 "dma_device_id": "system", 00:13:03.384 "dma_device_type": 1 00:13:03.384 }, 00:13:03.384 { 00:13:03.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.384 "dma_device_type": 2 00:13:03.384 } 00:13:03.384 ], 00:13:03.384 
"driver_specific": {} 00:13:03.384 }' 00:13:03.384 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.384 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.645 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.906 "name": "BaseBdev2", 00:13:03.906 "aliases": [ 00:13:03.906 "b57b5b67-cc3a-460b-9fa5-c9ff254a2373" 00:13:03.906 ], 00:13:03.906 "product_name": "Malloc disk", 00:13:03.906 "block_size": 512, 00:13:03.906 "num_blocks": 65536, 00:13:03.906 "uuid": "b57b5b67-cc3a-460b-9fa5-c9ff254a2373", 00:13:03.906 "assigned_rate_limits": { 00:13:03.906 "rw_ios_per_sec": 0, 00:13:03.906 "rw_mbytes_per_sec": 0, 00:13:03.906 "r_mbytes_per_sec": 0, 00:13:03.906 "w_mbytes_per_sec": 0 00:13:03.906 }, 00:13:03.906 "claimed": true, 00:13:03.906 "claim_type": "exclusive_write", 00:13:03.906 "zoned": false, 00:13:03.906 "supported_io_types": { 00:13:03.906 "read": true, 00:13:03.906 "write": true, 00:13:03.906 "unmap": true, 00:13:03.906 "flush": true, 00:13:03.906 "reset": true, 00:13:03.906 "nvme_admin": false, 00:13:03.906 "nvme_io": false, 00:13:03.906 "nvme_io_md": false, 00:13:03.906 "write_zeroes": true, 00:13:03.906 "zcopy": true, 00:13:03.906 "get_zone_info": false, 00:13:03.906 "zone_management": false, 00:13:03.906 "zone_append": false, 00:13:03.906 "compare": false, 00:13:03.906 "compare_and_write": false, 00:13:03.906 "abort": true, 00:13:03.906 "seek_hole": false, 00:13:03.906 "seek_data": false, 00:13:03.906 "copy": true, 00:13:03.906 "nvme_iov_md": false 00:13:03.906 }, 00:13:03.906 "memory_domains": [ 00:13:03.906 { 00:13:03.906 "dma_device_id": "system", 00:13:03.906 "dma_device_type": 1 00:13:03.906 }, 00:13:03.906 { 00:13:03.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.906 "dma_device_type": 2 00:13:03.906 } 00:13:03.906 ], 00:13:03.906 "driver_specific": {} 00:13:03.906 }' 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:13:03.906 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.165 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.424 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.424 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.424 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:04.424 07:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.424 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.424 "name": "BaseBdev3", 00:13:04.424 "aliases": [ 00:13:04.424 "b99a3f3f-fa9c-4cfa-b102-3001848c4998" 00:13:04.424 ], 00:13:04.424 "product_name": "Malloc disk", 00:13:04.424 "block_size": 512, 00:13:04.424 "num_blocks": 65536, 00:13:04.424 "uuid": "b99a3f3f-fa9c-4cfa-b102-3001848c4998", 00:13:04.424 "assigned_rate_limits": { 00:13:04.424 "rw_ios_per_sec": 0, 00:13:04.424 "rw_mbytes_per_sec": 0, 00:13:04.424 "r_mbytes_per_sec": 0, 00:13:04.424 "w_mbytes_per_sec": 0 00:13:04.424 }, 00:13:04.424 "claimed": true, 00:13:04.424 "claim_type": "exclusive_write", 00:13:04.424 "zoned": false, 00:13:04.424 "supported_io_types": { 00:13:04.424 "read": true, 00:13:04.424 "write": true, 00:13:04.424 "unmap": true, 00:13:04.424 "flush": true, 00:13:04.424 "reset": true, 00:13:04.424 "nvme_admin": false, 00:13:04.424 "nvme_io": false, 00:13:04.424 "nvme_io_md": false, 00:13:04.424 "write_zeroes": true, 00:13:04.424 "zcopy": true, 00:13:04.424 "get_zone_info": false, 00:13:04.424 "zone_management": false, 00:13:04.424 "zone_append": false, 00:13:04.424 "compare": false, 00:13:04.424 "compare_and_write": false, 00:13:04.424 "abort": true, 00:13:04.424 "seek_hole": false, 00:13:04.424 "seek_data": false, 00:13:04.424 "copy": true, 00:13:04.424 "nvme_iov_md": false 00:13:04.424 }, 00:13:04.424 "memory_domains": [ 00:13:04.424 { 00:13:04.424 "dma_device_id": "system", 00:13:04.424 "dma_device_type": 1 00:13:04.424 }, 00:13:04.424 { 00:13:04.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.424 "dma_device_type": 2 00:13:04.424 } 00:13:04.424 ], 00:13:04.425 "driver_specific": {} 00:13:04.425 }' 00:13:04.425 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.425 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.683 
07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.683 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:04.943 [2024-07-15 07:48:49.624959] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:04.943 [2024-07-15 07:48:49.624979] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:04.943 [2024-07-15 07:48:49.625010] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.943 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.943 07:48:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.202 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.202 "name": "Existed_Raid", 00:13:05.202 "uuid": "f024136e-df10-4e98-84e0-005f1c6b5955", 00:13:05.202 "strip_size_kb": 64, 00:13:05.202 "state": "offline", 00:13:05.202 "raid_level": "raid0", 00:13:05.202 "superblock": true, 00:13:05.202 "num_base_bdevs": 3, 00:13:05.202 "num_base_bdevs_discovered": 2, 00:13:05.202 "num_base_bdevs_operational": 2, 00:13:05.202 "base_bdevs_list": [ 00:13:05.202 { 00:13:05.202 "name": null, 00:13:05.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.202 "is_configured": false, 00:13:05.202 "data_offset": 2048, 00:13:05.202 "data_size": 63488 00:13:05.202 }, 00:13:05.202 { 00:13:05.202 "name": "BaseBdev2", 00:13:05.202 "uuid": "b57b5b67-cc3a-460b-9fa5-c9ff254a2373", 00:13:05.202 "is_configured": true, 00:13:05.202 "data_offset": 2048, 00:13:05.202 "data_size": 63488 00:13:05.202 }, 00:13:05.202 { 00:13:05.202 "name": "BaseBdev3", 00:13:05.202 "uuid": "b99a3f3f-fa9c-4cfa-b102-3001848c4998", 00:13:05.202 "is_configured": true, 00:13:05.202 "data_offset": 2048, 00:13:05.202 "data_size": 63488 00:13:05.202 } 00:13:05.202 ] 00:13:05.202 }' 00:13:05.202 07:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.202 07:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:05.770 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:05.770 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:05.770 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.770 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:06.029 [2024-07-15 07:48:50.751827] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.029 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:06.288 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:06.289 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:06.289 07:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:06.548 [2024-07-15 07:48:51.138613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:06.548 [2024-07-15 07:48:51.138649] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a78e90 name Existed_Raid, state offline 00:13:06.548 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:06.548 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.548 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.548 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:06.807 BaseBdev2 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:06.807 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:07.065 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:07.325 [ 00:13:07.325 { 00:13:07.325 "name": "BaseBdev2", 00:13:07.325 "aliases": [ 00:13:07.325 "8479f7a6-24d0-41c0-ac01-6072d7391b06" 00:13:07.325 ], 00:13:07.325 "product_name": "Malloc disk", 00:13:07.325 "block_size": 512, 00:13:07.325 "num_blocks": 65536, 00:13:07.325 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:07.325 "assigned_rate_limits": { 00:13:07.325 "rw_ios_per_sec": 0, 00:13:07.325 "rw_mbytes_per_sec": 0, 00:13:07.325 "r_mbytes_per_sec": 0, 00:13:07.325 "w_mbytes_per_sec": 0 00:13:07.325 }, 00:13:07.325 "claimed": false, 00:13:07.325 "zoned": false, 00:13:07.325 "supported_io_types": { 00:13:07.325 "read": true, 00:13:07.325 "write": true, 00:13:07.325 "unmap": true, 00:13:07.325 "flush": true, 00:13:07.325 "reset": true, 00:13:07.325 "nvme_admin": false, 00:13:07.326 
"nvme_io": false, 00:13:07.326 "nvme_io_md": false, 00:13:07.326 "write_zeroes": true, 00:13:07.326 "zcopy": true, 00:13:07.326 "get_zone_info": false, 00:13:07.326 "zone_management": false, 00:13:07.326 "zone_append": false, 00:13:07.326 "compare": false, 00:13:07.326 "compare_and_write": false, 00:13:07.326 "abort": true, 00:13:07.326 "seek_hole": false, 00:13:07.326 "seek_data": false, 00:13:07.326 "copy": true, 00:13:07.326 "nvme_iov_md": false 00:13:07.326 }, 00:13:07.326 "memory_domains": [ 00:13:07.326 { 00:13:07.326 "dma_device_id": "system", 00:13:07.326 "dma_device_type": 1 00:13:07.326 }, 00:13:07.326 { 00:13:07.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.326 "dma_device_type": 2 00:13:07.326 } 00:13:07.326 ], 00:13:07.326 "driver_specific": {} 00:13:07.326 } 00:13:07.326 ] 00:13:07.326 07:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:07.326 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:07.326 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:07.326 07:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:07.586 BaseBdev3 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:07.586 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:07.846 [ 00:13:07.846 { 00:13:07.846 "name": "BaseBdev3", 00:13:07.846 "aliases": [ 00:13:07.846 "eab6f928-bdb2-40ac-b980-12ad69dbadd2" 00:13:07.846 ], 00:13:07.846 "product_name": "Malloc disk", 00:13:07.846 "block_size": 512, 00:13:07.846 "num_blocks": 65536, 00:13:07.846 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:07.846 "assigned_rate_limits": { 00:13:07.846 "rw_ios_per_sec": 0, 00:13:07.846 "rw_mbytes_per_sec": 0, 00:13:07.846 "r_mbytes_per_sec": 0, 00:13:07.846 "w_mbytes_per_sec": 0 00:13:07.846 }, 00:13:07.846 "claimed": false, 00:13:07.846 "zoned": false, 00:13:07.846 "supported_io_types": { 00:13:07.846 "read": true, 00:13:07.846 "write": true, 00:13:07.846 "unmap": true, 00:13:07.846 "flush": true, 00:13:07.846 "reset": true, 00:13:07.846 "nvme_admin": false, 00:13:07.846 "nvme_io": false, 00:13:07.846 "nvme_io_md": false, 00:13:07.846 "write_zeroes": true, 00:13:07.846 "zcopy": true, 00:13:07.846 "get_zone_info": false, 00:13:07.846 "zone_management": false, 00:13:07.846 "zone_append": false, 00:13:07.846 "compare": 
false, 00:13:07.846 "compare_and_write": false, 00:13:07.846 "abort": true, 00:13:07.846 "seek_hole": false, 00:13:07.846 "seek_data": false, 00:13:07.846 "copy": true, 00:13:07.846 "nvme_iov_md": false 00:13:07.846 }, 00:13:07.846 "memory_domains": [ 00:13:07.846 { 00:13:07.846 "dma_device_id": "system", 00:13:07.846 "dma_device_type": 1 00:13:07.846 }, 00:13:07.846 { 00:13:07.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.846 "dma_device_type": 2 00:13:07.846 } 00:13:07.846 ], 00:13:07.846 "driver_specific": {} 00:13:07.846 } 00:13:07.846 ] 00:13:07.846 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:07.846 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:07.846 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:07.846 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:08.107 [2024-07-15 07:48:52.670375] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:08.107 [2024-07-15 07:48:52.670406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:08.107 [2024-07-15 07:48:52.670418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:08.107 [2024-07-15 07:48:52.671459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.107 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.367 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.367 "name": "Existed_Raid", 00:13:08.367 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:08.367 "strip_size_kb": 64, 00:13:08.367 "state": "configuring", 00:13:08.367 "raid_level": "raid0", 00:13:08.367 "superblock": true, 00:13:08.367 
"num_base_bdevs": 3, 00:13:08.367 "num_base_bdevs_discovered": 2, 00:13:08.367 "num_base_bdevs_operational": 3, 00:13:08.367 "base_bdevs_list": [ 00:13:08.367 { 00:13:08.367 "name": "BaseBdev1", 00:13:08.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.367 "is_configured": false, 00:13:08.367 "data_offset": 0, 00:13:08.367 "data_size": 0 00:13:08.367 }, 00:13:08.367 { 00:13:08.367 "name": "BaseBdev2", 00:13:08.367 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:08.367 "is_configured": true, 00:13:08.367 "data_offset": 2048, 00:13:08.367 "data_size": 63488 00:13:08.367 }, 00:13:08.367 { 00:13:08.367 "name": "BaseBdev3", 00:13:08.367 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:08.367 "is_configured": true, 00:13:08.367 "data_offset": 2048, 00:13:08.367 "data_size": 63488 00:13:08.367 } 00:13:08.367 ] 00:13:08.367 }' 00:13:08.367 07:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.367 07:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:08.955 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:08.955 [2024-07-15 07:48:53.568624] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:08.955 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:08.955 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.955 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.956 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.215 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.215 "name": "Existed_Raid", 00:13:09.215 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:09.215 "strip_size_kb": 64, 00:13:09.215 "state": "configuring", 00:13:09.215 "raid_level": "raid0", 00:13:09.215 "superblock": true, 00:13:09.215 "num_base_bdevs": 3, 00:13:09.215 "num_base_bdevs_discovered": 1, 00:13:09.215 "num_base_bdevs_operational": 3, 00:13:09.215 "base_bdevs_list": [ 00:13:09.215 { 00:13:09.215 "name": "BaseBdev1", 00:13:09.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.215 
"is_configured": false, 00:13:09.215 "data_offset": 0, 00:13:09.215 "data_size": 0 00:13:09.215 }, 00:13:09.215 { 00:13:09.215 "name": null, 00:13:09.215 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:09.215 "is_configured": false, 00:13:09.215 "data_offset": 2048, 00:13:09.215 "data_size": 63488 00:13:09.215 }, 00:13:09.215 { 00:13:09.215 "name": "BaseBdev3", 00:13:09.215 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:09.215 "is_configured": true, 00:13:09.215 "data_offset": 2048, 00:13:09.215 "data_size": 63488 00:13:09.215 } 00:13:09.215 ] 00:13:09.215 }' 00:13:09.215 07:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.215 07:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:10.156 07:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.157 07:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:10.157 07:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:10.157 07:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:10.417 [2024-07-15 07:48:55.057539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.417 BaseBdev1 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:10.417 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:10.677 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:10.938 [ 00:13:10.938 { 00:13:10.938 "name": "BaseBdev1", 00:13:10.938 "aliases": [ 00:13:10.938 "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88" 00:13:10.938 ], 00:13:10.938 "product_name": "Malloc disk", 00:13:10.938 "block_size": 512, 00:13:10.938 "num_blocks": 65536, 00:13:10.938 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:10.938 "assigned_rate_limits": { 00:13:10.938 "rw_ios_per_sec": 0, 00:13:10.938 "rw_mbytes_per_sec": 0, 00:13:10.938 "r_mbytes_per_sec": 0, 00:13:10.938 "w_mbytes_per_sec": 0 00:13:10.938 }, 00:13:10.938 "claimed": true, 00:13:10.938 "claim_type": "exclusive_write", 00:13:10.938 "zoned": false, 00:13:10.938 "supported_io_types": { 00:13:10.938 "read": true, 00:13:10.938 "write": true, 00:13:10.938 "unmap": true, 00:13:10.938 "flush": true, 00:13:10.938 "reset": true, 00:13:10.938 
"nvme_admin": false, 00:13:10.938 "nvme_io": false, 00:13:10.938 "nvme_io_md": false, 00:13:10.938 "write_zeroes": true, 00:13:10.938 "zcopy": true, 00:13:10.938 "get_zone_info": false, 00:13:10.938 "zone_management": false, 00:13:10.938 "zone_append": false, 00:13:10.938 "compare": false, 00:13:10.938 "compare_and_write": false, 00:13:10.938 "abort": true, 00:13:10.938 "seek_hole": false, 00:13:10.938 "seek_data": false, 00:13:10.938 "copy": true, 00:13:10.938 "nvme_iov_md": false 00:13:10.938 }, 00:13:10.938 "memory_domains": [ 00:13:10.938 { 00:13:10.938 "dma_device_id": "system", 00:13:10.938 "dma_device_type": 1 00:13:10.938 }, 00:13:10.938 { 00:13:10.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.938 "dma_device_type": 2 00:13:10.938 } 00:13:10.938 ], 00:13:10.938 "driver_specific": {} 00:13:10.938 } 00:13:10.938 ] 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.938 "name": "Existed_Raid", 00:13:10.938 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:10.938 "strip_size_kb": 64, 00:13:10.938 "state": "configuring", 00:13:10.938 "raid_level": "raid0", 00:13:10.938 "superblock": true, 00:13:10.938 "num_base_bdevs": 3, 00:13:10.938 "num_base_bdevs_discovered": 2, 00:13:10.938 "num_base_bdevs_operational": 3, 00:13:10.938 "base_bdevs_list": [ 00:13:10.938 { 00:13:10.938 "name": "BaseBdev1", 00:13:10.938 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:10.938 "is_configured": true, 00:13:10.938 "data_offset": 2048, 00:13:10.938 "data_size": 63488 00:13:10.938 }, 00:13:10.938 { 00:13:10.938 "name": null, 00:13:10.938 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:10.938 "is_configured": false, 00:13:10.938 "data_offset": 2048, 00:13:10.938 "data_size": 63488 00:13:10.938 }, 00:13:10.938 { 00:13:10.938 "name": "BaseBdev3", 00:13:10.938 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:10.938 
"is_configured": true, 00:13:10.938 "data_offset": 2048, 00:13:10.938 "data_size": 63488 00:13:10.938 } 00:13:10.938 ] 00:13:10.938 }' 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.938 07:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.509 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.509 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:11.769 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:11.769 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:12.029 [2024-07-15 07:48:56.565512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.029 "name": "Existed_Raid", 00:13:12.029 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:12.029 "strip_size_kb": 64, 00:13:12.029 "state": "configuring", 00:13:12.029 "raid_level": "raid0", 00:13:12.029 "superblock": true, 00:13:12.029 "num_base_bdevs": 3, 00:13:12.029 "num_base_bdevs_discovered": 1, 00:13:12.029 "num_base_bdevs_operational": 3, 00:13:12.029 "base_bdevs_list": [ 00:13:12.029 { 00:13:12.029 "name": "BaseBdev1", 00:13:12.029 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:12.029 "is_configured": true, 00:13:12.029 "data_offset": 2048, 00:13:12.029 "data_size": 63488 00:13:12.029 }, 00:13:12.029 { 00:13:12.029 "name": null, 00:13:12.029 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:12.029 "is_configured": false, 00:13:12.029 "data_offset": 2048, 
00:13:12.029 "data_size": 63488 00:13:12.029 }, 00:13:12.029 { 00:13:12.029 "name": null, 00:13:12.029 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:12.029 "is_configured": false, 00:13:12.029 "data_offset": 2048, 00:13:12.029 "data_size": 63488 00:13:12.029 } 00:13:12.029 ] 00:13:12.029 }' 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.029 07:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.600 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.600 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:12.861 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:12.861 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:13.122 [2024-07-15 07:48:57.656327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.122 "name": "Existed_Raid", 00:13:13.122 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:13.122 "strip_size_kb": 64, 00:13:13.122 "state": "configuring", 00:13:13.122 "raid_level": "raid0", 00:13:13.122 "superblock": true, 00:13:13.122 "num_base_bdevs": 3, 00:13:13.122 "num_base_bdevs_discovered": 2, 00:13:13.122 "num_base_bdevs_operational": 3, 00:13:13.122 "base_bdevs_list": [ 00:13:13.122 { 00:13:13.122 "name": "BaseBdev1", 00:13:13.122 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:13.122 "is_configured": true, 00:13:13.122 "data_offset": 2048, 00:13:13.122 "data_size": 
63488 00:13:13.122 }, 00:13:13.122 { 00:13:13.122 "name": null, 00:13:13.122 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:13.122 "is_configured": false, 00:13:13.122 "data_offset": 2048, 00:13:13.122 "data_size": 63488 00:13:13.122 }, 00:13:13.122 { 00:13:13.122 "name": "BaseBdev3", 00:13:13.122 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:13.122 "is_configured": true, 00:13:13.122 "data_offset": 2048, 00:13:13.122 "data_size": 63488 00:13:13.122 } 00:13:13.122 ] 00:13:13.122 }' 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.122 07:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:13.692 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.693 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:13.953 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:13.953 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:14.213 [2024-07-15 07:48:58.767155] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.213 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.474 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.474 "name": "Existed_Raid", 00:13:14.474 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:14.474 "strip_size_kb": 64, 00:13:14.474 "state": "configuring", 00:13:14.474 "raid_level": "raid0", 00:13:14.474 "superblock": true, 00:13:14.474 "num_base_bdevs": 3, 00:13:14.474 "num_base_bdevs_discovered": 1, 00:13:14.474 "num_base_bdevs_operational": 3, 00:13:14.474 "base_bdevs_list": [ 00:13:14.474 { 00:13:14.474 "name": null, 
00:13:14.474 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:14.474 "is_configured": false, 00:13:14.474 "data_offset": 2048, 00:13:14.474 "data_size": 63488 00:13:14.474 }, 00:13:14.474 { 00:13:14.474 "name": null, 00:13:14.474 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:14.474 "is_configured": false, 00:13:14.474 "data_offset": 2048, 00:13:14.474 "data_size": 63488 00:13:14.474 }, 00:13:14.474 { 00:13:14.474 "name": "BaseBdev3", 00:13:14.474 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:14.474 "is_configured": true, 00:13:14.474 "data_offset": 2048, 00:13:14.474 "data_size": 63488 00:13:14.474 } 00:13:14.474 ] 00:13:14.474 }' 00:13:14.474 07:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.474 07:48:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:15.045 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.045 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:15.045 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:15.045 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:15.306 [2024-07-15 07:48:59.879687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.306 07:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.566 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.566 "name": "Existed_Raid", 00:13:15.566 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:15.566 "strip_size_kb": 64, 00:13:15.566 "state": "configuring", 00:13:15.566 "raid_level": "raid0", 00:13:15.566 "superblock": true, 
00:13:15.566 "num_base_bdevs": 3, 00:13:15.566 "num_base_bdevs_discovered": 2, 00:13:15.566 "num_base_bdevs_operational": 3, 00:13:15.566 "base_bdevs_list": [ 00:13:15.566 { 00:13:15.566 "name": null, 00:13:15.566 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:15.566 "is_configured": false, 00:13:15.566 "data_offset": 2048, 00:13:15.566 "data_size": 63488 00:13:15.566 }, 00:13:15.566 { 00:13:15.566 "name": "BaseBdev2", 00:13:15.566 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:15.566 "is_configured": true, 00:13:15.566 "data_offset": 2048, 00:13:15.566 "data_size": 63488 00:13:15.566 }, 00:13:15.566 { 00:13:15.566 "name": "BaseBdev3", 00:13:15.566 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:15.566 "is_configured": true, 00:13:15.566 "data_offset": 2048, 00:13:15.566 "data_size": 63488 00:13:15.566 } 00:13:15.566 ] 00:13:15.566 }' 00:13:15.566 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.566 07:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:16.137 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.137 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:16.137 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:16.137 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.137 07:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:16.397 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b0deb1dd-e48c-4d82-ab5c-1606a7d73c88 00:13:16.658 [2024-07-15 07:49:01.184025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:16.658 [2024-07-15 07:49:01.184138] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a773d0 00:13:16.658 [2024-07-15 07:49:01.184145] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:16.658 [2024-07-15 07:49:01.184284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c25c50 00:13:16.658 [2024-07-15 07:49:01.184367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a773d0 00:13:16.658 [2024-07-15 07:49:01.184372] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a773d0 00:13:16.658 [2024-07-15 07:49:01.184440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:16.658 NewBaseBdev 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.658 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:16.918 [ 00:13:16.918 { 00:13:16.918 "name": "NewBaseBdev", 00:13:16.918 "aliases": [ 00:13:16.918 "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88" 00:13:16.918 ], 00:13:16.918 "product_name": "Malloc disk", 00:13:16.918 "block_size": 512, 00:13:16.918 "num_blocks": 65536, 00:13:16.918 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:16.918 "assigned_rate_limits": { 00:13:16.918 "rw_ios_per_sec": 0, 00:13:16.918 "rw_mbytes_per_sec": 0, 00:13:16.918 "r_mbytes_per_sec": 0, 00:13:16.918 "w_mbytes_per_sec": 0 00:13:16.918 }, 00:13:16.918 "claimed": true, 00:13:16.918 "claim_type": "exclusive_write", 00:13:16.918 "zoned": false, 00:13:16.918 "supported_io_types": { 00:13:16.918 "read": true, 00:13:16.918 "write": true, 00:13:16.918 "unmap": true, 00:13:16.918 "flush": true, 00:13:16.918 "reset": true, 00:13:16.918 "nvme_admin": false, 00:13:16.918 "nvme_io": false, 00:13:16.918 "nvme_io_md": false, 00:13:16.918 "write_zeroes": true, 00:13:16.918 "zcopy": true, 00:13:16.918 "get_zone_info": false, 00:13:16.918 "zone_management": false, 00:13:16.918 "zone_append": false, 00:13:16.918 "compare": false, 00:13:16.918 "compare_and_write": false, 00:13:16.918 "abort": true, 00:13:16.918 "seek_hole": false, 00:13:16.918 "seek_data": false, 00:13:16.918 "copy": true, 00:13:16.918 "nvme_iov_md": false 00:13:16.918 }, 00:13:16.918 "memory_domains": [ 00:13:16.918 { 00:13:16.918 "dma_device_id": "system", 00:13:16.918 "dma_device_type": 1 00:13:16.918 }, 00:13:16.918 { 00:13:16.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.918 "dma_device_type": 2 00:13:16.918 } 00:13:16.918 ], 00:13:16.918 "driver_specific": {} 00:13:16.918 } 00:13:16.918 ] 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.918 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:16.919 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.919 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.919 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.919 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
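The trace above recreates the removed base device under the UUID recorded in the RAID superblock and then waits for it to be examined before re-checking the array. A minimal sketch of that recreate-and-wait sequence, using only the RPCs exercised in this run and assuming an SPDK target already serving /var/tmp/spdk-raid.sock (the final jq state check is illustrative, not the exact assertion the test script makes):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Recreate the base bdev with the UUID the superblock expects (values taken from the trace above).
$rpc -s $sock bdev_malloc_create 32 512 -b NewBaseBdev -u b0deb1dd-e48c-4d82-ab5c-1606a7d73c88

# Let examine callbacks claim the new bdev, then poll for it with a 2000 ms timeout.
$rpc -s $sock bdev_wait_for_examine
$rpc -s $sock bdev_get_bdevs -b NewBaseBdev -t 2000

# The raid bdev should now report all base bdevs configured and state "online".
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'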
00:13:16.919 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.919 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.179 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.179 "name": "Existed_Raid", 00:13:17.179 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:17.179 "strip_size_kb": 64, 00:13:17.179 "state": "online", 00:13:17.179 "raid_level": "raid0", 00:13:17.179 "superblock": true, 00:13:17.179 "num_base_bdevs": 3, 00:13:17.179 "num_base_bdevs_discovered": 3, 00:13:17.179 "num_base_bdevs_operational": 3, 00:13:17.179 "base_bdevs_list": [ 00:13:17.179 { 00:13:17.179 "name": "NewBaseBdev", 00:13:17.179 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:17.179 "is_configured": true, 00:13:17.179 "data_offset": 2048, 00:13:17.179 "data_size": 63488 00:13:17.179 }, 00:13:17.179 { 00:13:17.179 "name": "BaseBdev2", 00:13:17.179 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:17.179 "is_configured": true, 00:13:17.179 "data_offset": 2048, 00:13:17.179 "data_size": 63488 00:13:17.179 }, 00:13:17.179 { 00:13:17.179 "name": "BaseBdev3", 00:13:17.179 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:17.179 "is_configured": true, 00:13:17.179 "data_offset": 2048, 00:13:17.179 "data_size": 63488 00:13:17.179 } 00:13:17.179 ] 00:13:17.179 }' 00:13:17.179 07:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.179 07:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:17.759 [2024-07-15 07:49:02.467599] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:17.759 "name": "Existed_Raid", 00:13:17.759 "aliases": [ 00:13:17.759 "3fb78b54-1f75-482f-9e0e-46d94ec16344" 00:13:17.759 ], 00:13:17.759 "product_name": "Raid Volume", 00:13:17.759 "block_size": 512, 00:13:17.759 "num_blocks": 190464, 00:13:17.759 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:17.759 "assigned_rate_limits": { 00:13:17.759 "rw_ios_per_sec": 0, 00:13:17.759 "rw_mbytes_per_sec": 0, 00:13:17.759 "r_mbytes_per_sec": 0, 00:13:17.759 "w_mbytes_per_sec": 0 00:13:17.759 }, 00:13:17.759 "claimed": false, 00:13:17.759 "zoned": false, 
00:13:17.759 "supported_io_types": { 00:13:17.759 "read": true, 00:13:17.759 "write": true, 00:13:17.759 "unmap": true, 00:13:17.759 "flush": true, 00:13:17.759 "reset": true, 00:13:17.759 "nvme_admin": false, 00:13:17.759 "nvme_io": false, 00:13:17.759 "nvme_io_md": false, 00:13:17.759 "write_zeroes": true, 00:13:17.759 "zcopy": false, 00:13:17.759 "get_zone_info": false, 00:13:17.759 "zone_management": false, 00:13:17.759 "zone_append": false, 00:13:17.759 "compare": false, 00:13:17.759 "compare_and_write": false, 00:13:17.759 "abort": false, 00:13:17.759 "seek_hole": false, 00:13:17.759 "seek_data": false, 00:13:17.759 "copy": false, 00:13:17.759 "nvme_iov_md": false 00:13:17.759 }, 00:13:17.759 "memory_domains": [ 00:13:17.759 { 00:13:17.759 "dma_device_id": "system", 00:13:17.759 "dma_device_type": 1 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.759 "dma_device_type": 2 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "dma_device_id": "system", 00:13:17.759 "dma_device_type": 1 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.759 "dma_device_type": 2 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "dma_device_id": "system", 00:13:17.759 "dma_device_type": 1 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.759 "dma_device_type": 2 00:13:17.759 } 00:13:17.759 ], 00:13:17.759 "driver_specific": { 00:13:17.759 "raid": { 00:13:17.759 "uuid": "3fb78b54-1f75-482f-9e0e-46d94ec16344", 00:13:17.759 "strip_size_kb": 64, 00:13:17.759 "state": "online", 00:13:17.759 "raid_level": "raid0", 00:13:17.759 "superblock": true, 00:13:17.759 "num_base_bdevs": 3, 00:13:17.759 "num_base_bdevs_discovered": 3, 00:13:17.759 "num_base_bdevs_operational": 3, 00:13:17.759 "base_bdevs_list": [ 00:13:17.759 { 00:13:17.759 "name": "NewBaseBdev", 00:13:17.759 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:17.759 "is_configured": true, 00:13:17.759 "data_offset": 2048, 00:13:17.759 "data_size": 63488 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "name": "BaseBdev2", 00:13:17.759 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:17.759 "is_configured": true, 00:13:17.759 "data_offset": 2048, 00:13:17.759 "data_size": 63488 00:13:17.759 }, 00:13:17.759 { 00:13:17.759 "name": "BaseBdev3", 00:13:17.759 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:17.759 "is_configured": true, 00:13:17.759 "data_offset": 2048, 00:13:17.759 "data_size": 63488 00:13:17.759 } 00:13:17.759 ] 00:13:17.759 } 00:13:17.759 } 00:13:17.759 }' 00:13:17.759 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:18.078 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:18.078 BaseBdev2 00:13:18.078 BaseBdev3' 00:13:18.078 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.078 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:18.078 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.078 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.078 "name": "NewBaseBdev", 00:13:18.078 "aliases": [ 00:13:18.078 
"b0deb1dd-e48c-4d82-ab5c-1606a7d73c88" 00:13:18.078 ], 00:13:18.078 "product_name": "Malloc disk", 00:13:18.078 "block_size": 512, 00:13:18.078 "num_blocks": 65536, 00:13:18.078 "uuid": "b0deb1dd-e48c-4d82-ab5c-1606a7d73c88", 00:13:18.078 "assigned_rate_limits": { 00:13:18.078 "rw_ios_per_sec": 0, 00:13:18.078 "rw_mbytes_per_sec": 0, 00:13:18.078 "r_mbytes_per_sec": 0, 00:13:18.078 "w_mbytes_per_sec": 0 00:13:18.078 }, 00:13:18.079 "claimed": true, 00:13:18.079 "claim_type": "exclusive_write", 00:13:18.079 "zoned": false, 00:13:18.079 "supported_io_types": { 00:13:18.079 "read": true, 00:13:18.079 "write": true, 00:13:18.079 "unmap": true, 00:13:18.079 "flush": true, 00:13:18.079 "reset": true, 00:13:18.079 "nvme_admin": false, 00:13:18.079 "nvme_io": false, 00:13:18.079 "nvme_io_md": false, 00:13:18.079 "write_zeroes": true, 00:13:18.079 "zcopy": true, 00:13:18.079 "get_zone_info": false, 00:13:18.079 "zone_management": false, 00:13:18.079 "zone_append": false, 00:13:18.079 "compare": false, 00:13:18.079 "compare_and_write": false, 00:13:18.079 "abort": true, 00:13:18.079 "seek_hole": false, 00:13:18.079 "seek_data": false, 00:13:18.079 "copy": true, 00:13:18.079 "nvme_iov_md": false 00:13:18.079 }, 00:13:18.079 "memory_domains": [ 00:13:18.079 { 00:13:18.079 "dma_device_id": "system", 00:13:18.079 "dma_device_type": 1 00:13:18.079 }, 00:13:18.079 { 00:13:18.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.079 "dma_device_type": 2 00:13:18.079 } 00:13:18.079 ], 00:13:18.079 "driver_specific": {} 00:13:18.079 }' 00:13:18.079 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.079 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.079 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.079 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.338 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.338 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.338 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.339 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.339 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.339 07:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.339 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.339 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.339 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.339 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:18.339 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.598 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.598 "name": "BaseBdev2", 00:13:18.598 "aliases": [ 00:13:18.598 "8479f7a6-24d0-41c0-ac01-6072d7391b06" 00:13:18.598 ], 00:13:18.598 "product_name": "Malloc disk", 00:13:18.598 "block_size": 512, 
00:13:18.598 "num_blocks": 65536, 00:13:18.598 "uuid": "8479f7a6-24d0-41c0-ac01-6072d7391b06", 00:13:18.598 "assigned_rate_limits": { 00:13:18.598 "rw_ios_per_sec": 0, 00:13:18.598 "rw_mbytes_per_sec": 0, 00:13:18.598 "r_mbytes_per_sec": 0, 00:13:18.598 "w_mbytes_per_sec": 0 00:13:18.598 }, 00:13:18.598 "claimed": true, 00:13:18.598 "claim_type": "exclusive_write", 00:13:18.598 "zoned": false, 00:13:18.598 "supported_io_types": { 00:13:18.598 "read": true, 00:13:18.598 "write": true, 00:13:18.598 "unmap": true, 00:13:18.598 "flush": true, 00:13:18.598 "reset": true, 00:13:18.598 "nvme_admin": false, 00:13:18.598 "nvme_io": false, 00:13:18.598 "nvme_io_md": false, 00:13:18.598 "write_zeroes": true, 00:13:18.598 "zcopy": true, 00:13:18.598 "get_zone_info": false, 00:13:18.598 "zone_management": false, 00:13:18.598 "zone_append": false, 00:13:18.598 "compare": false, 00:13:18.598 "compare_and_write": false, 00:13:18.598 "abort": true, 00:13:18.598 "seek_hole": false, 00:13:18.598 "seek_data": false, 00:13:18.598 "copy": true, 00:13:18.598 "nvme_iov_md": false 00:13:18.598 }, 00:13:18.598 "memory_domains": [ 00:13:18.598 { 00:13:18.598 "dma_device_id": "system", 00:13:18.598 "dma_device_type": 1 00:13:18.598 }, 00:13:18.598 { 00:13:18.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.598 "dma_device_type": 2 00:13:18.598 } 00:13:18.598 ], 00:13:18.598 "driver_specific": {} 00:13:18.598 }' 00:13:18.598 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.598 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:18.859 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:19.120 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:19.120 "name": "BaseBdev3", 00:13:19.120 "aliases": [ 00:13:19.120 "eab6f928-bdb2-40ac-b980-12ad69dbadd2" 00:13:19.120 ], 00:13:19.120 "product_name": "Malloc disk", 00:13:19.120 "block_size": 512, 00:13:19.120 "num_blocks": 65536, 00:13:19.120 "uuid": "eab6f928-bdb2-40ac-b980-12ad69dbadd2", 00:13:19.120 "assigned_rate_limits": { 
00:13:19.120 "rw_ios_per_sec": 0, 00:13:19.120 "rw_mbytes_per_sec": 0, 00:13:19.120 "r_mbytes_per_sec": 0, 00:13:19.120 "w_mbytes_per_sec": 0 00:13:19.120 }, 00:13:19.120 "claimed": true, 00:13:19.120 "claim_type": "exclusive_write", 00:13:19.120 "zoned": false, 00:13:19.120 "supported_io_types": { 00:13:19.120 "read": true, 00:13:19.120 "write": true, 00:13:19.120 "unmap": true, 00:13:19.120 "flush": true, 00:13:19.120 "reset": true, 00:13:19.120 "nvme_admin": false, 00:13:19.120 "nvme_io": false, 00:13:19.120 "nvme_io_md": false, 00:13:19.120 "write_zeroes": true, 00:13:19.120 "zcopy": true, 00:13:19.120 "get_zone_info": false, 00:13:19.120 "zone_management": false, 00:13:19.120 "zone_append": false, 00:13:19.120 "compare": false, 00:13:19.120 "compare_and_write": false, 00:13:19.120 "abort": true, 00:13:19.120 "seek_hole": false, 00:13:19.120 "seek_data": false, 00:13:19.120 "copy": true, 00:13:19.120 "nvme_iov_md": false 00:13:19.120 }, 00:13:19.120 "memory_domains": [ 00:13:19.120 { 00:13:19.120 "dma_device_id": "system", 00:13:19.120 "dma_device_type": 1 00:13:19.120 }, 00:13:19.120 { 00:13:19.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.120 "dma_device_type": 2 00:13:19.120 } 00:13:19.120 ], 00:13:19.120 "driver_specific": {} 00:13:19.120 }' 00:13:19.120 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.120 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.120 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:19.120 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.380 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.380 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:19.380 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.380 07:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.380 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:19.380 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.380 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.380 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:19.380 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:19.641 [2024-07-15 07:49:04.304000] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:19.641 [2024-07-15 07:49:04.304019] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:19.641 [2024-07-15 07:49:04.304057] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:19.641 [2024-07-15 07:49:04.304096] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:19.641 [2024-07-15 07:49:04.304102] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a773d0 name Existed_Raid, state offline 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1614193 00:13:19.641 
07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1614193 ']' 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1614193 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1614193 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1614193' 00:13:19.641 killing process with pid 1614193 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1614193 00:13:19.641 [2024-07-15 07:49:04.371460] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:19.641 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1614193 00:13:19.641 [2024-07-15 07:49:04.386222] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:19.903 07:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:19.903 00:13:19.903 real 0m24.160s 00:13:19.903 user 0m45.340s 00:13:19.903 sys 0m3.549s 00:13:19.903 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:19.903 07:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.903 ************************************ 00:13:19.903 END TEST raid_state_function_test_sb 00:13:19.903 ************************************ 00:13:19.903 07:49:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:19.903 07:49:04 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:13:19.903 07:49:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:19.903 07:49:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:19.903 07:49:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:19.903 ************************************ 00:13:19.903 START TEST raid_superblock_test 00:13:19.903 ************************************ 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 
00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1618884 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1618884 /var/tmp/spdk-raid.sock 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1618884 ']' 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:19.903 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:19.903 07:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.903 [2024-07-15 07:49:04.645573] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
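The superblock test traced below builds a raid0 array from three passthru bdevs layered on malloc bdevs, each passthru pinned to a fixed UUID, and creates the array with an on-disk superblock (-s). Condensed from the RPCs it issues (same rpc.py path and socket as above; one malloc/passthru pair shown, repeated for malloc2/pt2 and malloc3/pt3):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Create a 32 MB malloc bdev with 512-byte blocks, then wrap it in a passthru bdev with a fixed UUID.
$rpc -s $sock bdev_malloc_create 32 512 -b malloc1
$rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
# (repeat for malloc2/pt2 and malloc3/pt3)

# Assemble the raid0 array with a 64 KB strip size and superblock support enabled.
$rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s

# Confirm the array came online with three configured base bdevs.
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'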
00:13:19.904 [2024-07-15 07:49:04.645630] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618884 ] 00:13:20.165 [2024-07-15 07:49:04.739152] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:20.165 [2024-07-15 07:49:04.816336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:20.165 [2024-07-15 07:49:04.862107] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:20.165 [2024-07-15 07:49:04.862133] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:21.107 malloc1 00:13:21.107 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:21.107 [2024-07-15 07:49:05.848974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:21.107 [2024-07-15 07:49:05.849011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.107 [2024-07-15 07:49:05.849022] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2158a20 00:13:21.107 [2024-07-15 07:49:05.849029] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.107 [2024-07-15 07:49:05.850298] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.107 [2024-07-15 07:49:05.850317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:21.107 pt1 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:21.367 07:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:21.367 malloc2 00:13:21.367 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:21.627 [2024-07-15 07:49:06.219857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:21.627 [2024-07-15 07:49:06.219885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.627 [2024-07-15 07:49:06.219897] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2159040 00:13:21.627 [2024-07-15 07:49:06.219903] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.627 [2024-07-15 07:49:06.221079] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.627 [2024-07-15 07:49:06.221098] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:21.627 pt2 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:21.627 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:21.887 malloc3 00:13:21.887 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:21.887 [2024-07-15 07:49:06.598645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:21.887 [2024-07-15 07:49:06.598674] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.887 [2024-07-15 07:49:06.598684] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2159540 00:13:21.887 [2024-07-15 07:49:06.598691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.887 [2024-07-15 07:49:06.599847] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.887 [2024-07-15 07:49:06.599866] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:21.887 pt3 00:13:21.887 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:21.887 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:21.887 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:22.147 [2024-07-15 07:49:06.783160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:22.147 [2024-07-15 07:49:06.784530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:22.147 [2024-07-15 07:49:06.784585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:22.147 [2024-07-15 07:49:06.784750] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2305a90 00:13:22.147 [2024-07-15 07:49:06.784760] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:22.147 [2024-07-15 07:49:06.784957] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2301c50 00:13:22.147 [2024-07-15 07:49:06.785104] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2305a90 00:13:22.147 [2024-07-15 07:49:06.785112] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2305a90 00:13:22.147 [2024-07-15 07:49:06.785207] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.147 07:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:22.407 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.407 "name": "raid_bdev1", 00:13:22.407 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:22.407 "strip_size_kb": 64, 00:13:22.407 "state": "online", 00:13:22.407 "raid_level": "raid0", 00:13:22.407 "superblock": true, 00:13:22.407 "num_base_bdevs": 3, 
00:13:22.407 "num_base_bdevs_discovered": 3, 00:13:22.407 "num_base_bdevs_operational": 3, 00:13:22.407 "base_bdevs_list": [ 00:13:22.407 { 00:13:22.407 "name": "pt1", 00:13:22.407 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:22.407 "is_configured": true, 00:13:22.407 "data_offset": 2048, 00:13:22.407 "data_size": 63488 00:13:22.407 }, 00:13:22.407 { 00:13:22.407 "name": "pt2", 00:13:22.407 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:22.407 "is_configured": true, 00:13:22.407 "data_offset": 2048, 00:13:22.407 "data_size": 63488 00:13:22.407 }, 00:13:22.407 { 00:13:22.407 "name": "pt3", 00:13:22.407 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:22.407 "is_configured": true, 00:13:22.407 "data_offset": 2048, 00:13:22.407 "data_size": 63488 00:13:22.407 } 00:13:22.407 ] 00:13:22.407 }' 00:13:22.407 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.407 07:49:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:22.976 [2024-07-15 07:49:07.709794] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:22.976 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:22.976 "name": "raid_bdev1", 00:13:22.976 "aliases": [ 00:13:22.977 "996e4a01-f36f-4ee9-bdc7-0fbe5586106b" 00:13:22.977 ], 00:13:22.977 "product_name": "Raid Volume", 00:13:22.977 "block_size": 512, 00:13:22.977 "num_blocks": 190464, 00:13:22.977 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:22.977 "assigned_rate_limits": { 00:13:22.977 "rw_ios_per_sec": 0, 00:13:22.977 "rw_mbytes_per_sec": 0, 00:13:22.977 "r_mbytes_per_sec": 0, 00:13:22.977 "w_mbytes_per_sec": 0 00:13:22.977 }, 00:13:22.977 "claimed": false, 00:13:22.977 "zoned": false, 00:13:22.977 "supported_io_types": { 00:13:22.977 "read": true, 00:13:22.977 "write": true, 00:13:22.977 "unmap": true, 00:13:22.977 "flush": true, 00:13:22.977 "reset": true, 00:13:22.977 "nvme_admin": false, 00:13:22.977 "nvme_io": false, 00:13:22.977 "nvme_io_md": false, 00:13:22.977 "write_zeroes": true, 00:13:22.977 "zcopy": false, 00:13:22.977 "get_zone_info": false, 00:13:22.977 "zone_management": false, 00:13:22.977 "zone_append": false, 00:13:22.977 "compare": false, 00:13:22.977 "compare_and_write": false, 00:13:22.977 "abort": false, 00:13:22.977 "seek_hole": false, 00:13:22.977 "seek_data": false, 00:13:22.977 "copy": false, 00:13:22.977 "nvme_iov_md": false 00:13:22.977 }, 00:13:22.977 "memory_domains": [ 00:13:22.977 { 00:13:22.977 "dma_device_id": "system", 00:13:22.977 "dma_device_type": 1 
00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.977 "dma_device_type": 2 00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "dma_device_id": "system", 00:13:22.977 "dma_device_type": 1 00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.977 "dma_device_type": 2 00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "dma_device_id": "system", 00:13:22.977 "dma_device_type": 1 00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.977 "dma_device_type": 2 00:13:22.977 } 00:13:22.977 ], 00:13:22.977 "driver_specific": { 00:13:22.977 "raid": { 00:13:22.977 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:22.977 "strip_size_kb": 64, 00:13:22.977 "state": "online", 00:13:22.977 "raid_level": "raid0", 00:13:22.977 "superblock": true, 00:13:22.977 "num_base_bdevs": 3, 00:13:22.977 "num_base_bdevs_discovered": 3, 00:13:22.977 "num_base_bdevs_operational": 3, 00:13:22.977 "base_bdevs_list": [ 00:13:22.977 { 00:13:22.977 "name": "pt1", 00:13:22.977 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:22.977 "is_configured": true, 00:13:22.977 "data_offset": 2048, 00:13:22.977 "data_size": 63488 00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "name": "pt2", 00:13:22.977 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:22.977 "is_configured": true, 00:13:22.977 "data_offset": 2048, 00:13:22.977 "data_size": 63488 00:13:22.977 }, 00:13:22.977 { 00:13:22.977 "name": "pt3", 00:13:22.977 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:22.977 "is_configured": true, 00:13:22.977 "data_offset": 2048, 00:13:22.977 "data_size": 63488 00:13:22.977 } 00:13:22.977 ] 00:13:22.977 } 00:13:22.977 } 00:13:22.977 }' 00:13:23.237 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:23.237 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:23.237 pt2 00:13:23.237 pt3' 00:13:23.237 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.237 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:23.237 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:23.237 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:23.237 "name": "pt1", 00:13:23.237 "aliases": [ 00:13:23.237 "00000000-0000-0000-0000-000000000001" 00:13:23.237 ], 00:13:23.237 "product_name": "passthru", 00:13:23.237 "block_size": 512, 00:13:23.237 "num_blocks": 65536, 00:13:23.237 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:23.237 "assigned_rate_limits": { 00:13:23.237 "rw_ios_per_sec": 0, 00:13:23.237 "rw_mbytes_per_sec": 0, 00:13:23.237 "r_mbytes_per_sec": 0, 00:13:23.237 "w_mbytes_per_sec": 0 00:13:23.237 }, 00:13:23.237 "claimed": true, 00:13:23.237 "claim_type": "exclusive_write", 00:13:23.237 "zoned": false, 00:13:23.237 "supported_io_types": { 00:13:23.237 "read": true, 00:13:23.237 "write": true, 00:13:23.237 "unmap": true, 00:13:23.237 "flush": true, 00:13:23.237 "reset": true, 00:13:23.237 "nvme_admin": false, 00:13:23.237 "nvme_io": false, 00:13:23.237 "nvme_io_md": false, 00:13:23.237 "write_zeroes": true, 00:13:23.237 "zcopy": true, 00:13:23.237 "get_zone_info": false, 00:13:23.237 "zone_management": false, 
00:13:23.237 "zone_append": false, 00:13:23.237 "compare": false, 00:13:23.237 "compare_and_write": false, 00:13:23.237 "abort": true, 00:13:23.237 "seek_hole": false, 00:13:23.237 "seek_data": false, 00:13:23.237 "copy": true, 00:13:23.237 "nvme_iov_md": false 00:13:23.237 }, 00:13:23.237 "memory_domains": [ 00:13:23.237 { 00:13:23.237 "dma_device_id": "system", 00:13:23.238 "dma_device_type": 1 00:13:23.238 }, 00:13:23.238 { 00:13:23.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.238 "dma_device_type": 2 00:13:23.238 } 00:13:23.238 ], 00:13:23.238 "driver_specific": { 00:13:23.238 "passthru": { 00:13:23.238 "name": "pt1", 00:13:23.238 "base_bdev_name": "malloc1" 00:13:23.238 } 00:13:23.238 } 00:13:23.238 }' 00:13:23.238 07:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.509 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.770 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.770 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.770 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:23.770 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:23.770 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:23.770 "name": "pt2", 00:13:23.770 "aliases": [ 00:13:23.770 "00000000-0000-0000-0000-000000000002" 00:13:23.770 ], 00:13:23.770 "product_name": "passthru", 00:13:23.770 "block_size": 512, 00:13:23.770 "num_blocks": 65536, 00:13:23.770 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:23.770 "assigned_rate_limits": { 00:13:23.770 "rw_ios_per_sec": 0, 00:13:23.770 "rw_mbytes_per_sec": 0, 00:13:23.770 "r_mbytes_per_sec": 0, 00:13:23.770 "w_mbytes_per_sec": 0 00:13:23.770 }, 00:13:23.770 "claimed": true, 00:13:23.770 "claim_type": "exclusive_write", 00:13:23.770 "zoned": false, 00:13:23.770 "supported_io_types": { 00:13:23.770 "read": true, 00:13:23.770 "write": true, 00:13:23.770 "unmap": true, 00:13:23.770 "flush": true, 00:13:23.770 "reset": true, 00:13:23.770 "nvme_admin": false, 00:13:23.770 "nvme_io": false, 00:13:23.770 "nvme_io_md": false, 00:13:23.770 "write_zeroes": true, 00:13:23.770 "zcopy": true, 00:13:23.770 "get_zone_info": false, 00:13:23.770 "zone_management": false, 00:13:23.770 "zone_append": false, 00:13:23.770 "compare": false, 00:13:23.770 "compare_and_write": false, 00:13:23.771 "abort": true, 
00:13:23.771 "seek_hole": false, 00:13:23.771 "seek_data": false, 00:13:23.771 "copy": true, 00:13:23.771 "nvme_iov_md": false 00:13:23.771 }, 00:13:23.771 "memory_domains": [ 00:13:23.771 { 00:13:23.771 "dma_device_id": "system", 00:13:23.771 "dma_device_type": 1 00:13:23.771 }, 00:13:23.771 { 00:13:23.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.771 "dma_device_type": 2 00:13:23.771 } 00:13:23.771 ], 00:13:23.771 "driver_specific": { 00:13:23.771 "passthru": { 00:13:23.771 "name": "pt2", 00:13:23.771 "base_bdev_name": "malloc2" 00:13:23.771 } 00:13:23.771 } 00:13:23.771 }' 00:13:23.771 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.771 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.030 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.289 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.290 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.290 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:24.290 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.290 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.290 "name": "pt3", 00:13:24.290 "aliases": [ 00:13:24.290 "00000000-0000-0000-0000-000000000003" 00:13:24.290 ], 00:13:24.290 "product_name": "passthru", 00:13:24.290 "block_size": 512, 00:13:24.290 "num_blocks": 65536, 00:13:24.290 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:24.290 "assigned_rate_limits": { 00:13:24.290 "rw_ios_per_sec": 0, 00:13:24.290 "rw_mbytes_per_sec": 0, 00:13:24.290 "r_mbytes_per_sec": 0, 00:13:24.290 "w_mbytes_per_sec": 0 00:13:24.290 }, 00:13:24.290 "claimed": true, 00:13:24.290 "claim_type": "exclusive_write", 00:13:24.290 "zoned": false, 00:13:24.290 "supported_io_types": { 00:13:24.290 "read": true, 00:13:24.290 "write": true, 00:13:24.290 "unmap": true, 00:13:24.290 "flush": true, 00:13:24.290 "reset": true, 00:13:24.290 "nvme_admin": false, 00:13:24.290 "nvme_io": false, 00:13:24.290 "nvme_io_md": false, 00:13:24.290 "write_zeroes": true, 00:13:24.290 "zcopy": true, 00:13:24.290 "get_zone_info": false, 00:13:24.290 "zone_management": false, 00:13:24.290 "zone_append": false, 00:13:24.290 "compare": false, 00:13:24.290 "compare_and_write": false, 00:13:24.290 "abort": true, 00:13:24.290 "seek_hole": false, 00:13:24.290 "seek_data": false, 00:13:24.290 "copy": true, 00:13:24.290 "nvme_iov_md": false 
00:13:24.290 }, 00:13:24.290 "memory_domains": [ 00:13:24.290 { 00:13:24.290 "dma_device_id": "system", 00:13:24.290 "dma_device_type": 1 00:13:24.290 }, 00:13:24.290 { 00:13:24.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.290 "dma_device_type": 2 00:13:24.290 } 00:13:24.290 ], 00:13:24.290 "driver_specific": { 00:13:24.290 "passthru": { 00:13:24.290 "name": "pt3", 00:13:24.290 "base_bdev_name": "malloc3" 00:13:24.290 } 00:13:24.290 } 00:13:24.290 }' 00:13:24.290 07:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.290 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.549 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.549 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.549 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.549 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.550 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.550 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.550 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.550 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.550 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.809 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.809 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:24.809 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:24.809 [2024-07-15 07:49:09.510341] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:24.809 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=996e4a01-f36f-4ee9-bdc7-0fbe5586106b 00:13:24.809 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 996e4a01-f36f-4ee9-bdc7-0fbe5586106b ']' 00:13:24.809 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:25.069 [2024-07-15 07:49:09.706613] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:25.069 [2024-07-15 07:49:09.706624] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:25.069 [2024-07-15 07:49:09.706659] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:25.069 [2024-07-15 07:49:09.706695] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:25.069 [2024-07-15 07:49:09.706701] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2305a90 name raid_bdev1, state offline 00:13:25.069 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.069 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:25.329 07:49:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:25.329 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:25.329 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:25.329 07:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:25.588 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:25.588 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:25.588 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:25.588 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:25.848 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:25.848 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:26.108 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:13:26.368 [2024-07-15 07:49:10.869524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:26.368 [2024-07-15 07:49:10.870587] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:26.368 [2024-07-15 07:49:10.870619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:26.368 [2024-07-15 07:49:10.870650] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:26.368 [2024-07-15 07:49:10.870678] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:26.368 [2024-07-15 07:49:10.870692] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:26.368 [2024-07-15 07:49:10.870702] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:26.368 [2024-07-15 07:49:10.870708] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2301bf0 name raid_bdev1, state configuring 00:13:26.368 request: 00:13:26.368 { 00:13:26.368 "name": "raid_bdev1", 00:13:26.368 "raid_level": "raid0", 00:13:26.368 "base_bdevs": [ 00:13:26.368 "malloc1", 00:13:26.368 "malloc2", 00:13:26.368 "malloc3" 00:13:26.368 ], 00:13:26.368 "strip_size_kb": 64, 00:13:26.368 "superblock": false, 00:13:26.368 "method": "bdev_raid_create", 00:13:26.368 "req_id": 1 00:13:26.368 } 00:13:26.368 Got JSON-RPC error response 00:13:26.368 response: 00:13:26.368 { 00:13:26.368 "code": -17, 00:13:26.368 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:26.368 } 00:13:26.368 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:26.368 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:26.368 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:26.368 07:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:26.368 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.368 07:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:26.368 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:26.368 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:26.368 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:26.629 [2024-07-15 07:49:11.250429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:26.629 [2024-07-15 07:49:11.250451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:26.629 [2024-07-15 07:49:11.250463] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2159e00 00:13:26.629 [2024-07-15 07:49:11.250469] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:26.629 [2024-07-15 07:49:11.251692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:26.629 [2024-07-15 07:49:11.251718] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:26.629 [2024-07-15 07:49:11.251761] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:26.629 [2024-07-15 07:49:11.251778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:26.629 pt1 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.629 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:26.889 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.889 "name": "raid_bdev1", 00:13:26.889 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:26.889 "strip_size_kb": 64, 00:13:26.889 "state": "configuring", 00:13:26.889 "raid_level": "raid0", 00:13:26.889 "superblock": true, 00:13:26.889 "num_base_bdevs": 3, 00:13:26.889 "num_base_bdevs_discovered": 1, 00:13:26.889 "num_base_bdevs_operational": 3, 00:13:26.889 "base_bdevs_list": [ 00:13:26.889 { 00:13:26.889 "name": "pt1", 00:13:26.889 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:26.889 "is_configured": true, 00:13:26.889 "data_offset": 2048, 00:13:26.889 "data_size": 63488 00:13:26.889 }, 00:13:26.889 { 00:13:26.889 "name": null, 00:13:26.889 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:26.889 "is_configured": false, 00:13:26.889 "data_offset": 2048, 00:13:26.889 "data_size": 63488 00:13:26.889 }, 00:13:26.889 { 00:13:26.889 "name": null, 00:13:26.889 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:26.889 "is_configured": false, 00:13:26.889 "data_offset": 2048, 00:13:26.889 "data_size": 63488 00:13:26.889 } 00:13:26.889 ] 00:13:26.889 }' 00:13:26.889 07:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.889 07:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.459 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:27.459 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:27.459 [2024-07-15 07:49:12.184805] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:27.459 [2024-07-15 07:49:12.184834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:27.459 [2024-07-15 07:49:12.184845] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2158c50 00:13:27.459 [2024-07-15 07:49:12.184851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.459 [2024-07-15 07:49:12.185103] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.459 [2024-07-15 07:49:12.185115] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:27.459 [2024-07-15 07:49:12.185154] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:27.459 [2024-07-15 07:49:12.185167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:27.459 pt2 00:13:27.459 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:27.720 [2024-07-15 07:49:12.381307] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.720 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:27.980 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.980 "name": "raid_bdev1", 00:13:27.980 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:27.980 "strip_size_kb": 64, 00:13:27.980 "state": "configuring", 00:13:27.980 "raid_level": "raid0", 00:13:27.980 "superblock": true, 00:13:27.980 "num_base_bdevs": 3, 00:13:27.980 "num_base_bdevs_discovered": 1, 00:13:27.980 "num_base_bdevs_operational": 3, 00:13:27.980 "base_bdevs_list": [ 00:13:27.980 { 00:13:27.980 "name": "pt1", 00:13:27.980 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:27.980 "is_configured": true, 00:13:27.980 "data_offset": 2048, 00:13:27.980 "data_size": 63488 00:13:27.980 }, 00:13:27.980 { 00:13:27.980 "name": null, 00:13:27.980 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:27.980 "is_configured": false, 00:13:27.980 
"data_offset": 2048, 00:13:27.980 "data_size": 63488 00:13:27.980 }, 00:13:27.980 { 00:13:27.980 "name": null, 00:13:27.980 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:27.980 "is_configured": false, 00:13:27.980 "data_offset": 2048, 00:13:27.980 "data_size": 63488 00:13:27.980 } 00:13:27.980 ] 00:13:27.980 }' 00:13:27.980 07:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.980 07:49:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.551 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:28.551 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:28.551 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:28.551 [2024-07-15 07:49:13.303649] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:28.551 [2024-07-15 07:49:13.303685] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.551 [2024-07-15 07:49:13.303696] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23043f0 00:13:28.551 [2024-07-15 07:49:13.303702] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.551 [2024-07-15 07:49:13.303981] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.551 [2024-07-15 07:49:13.303993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:28.551 [2024-07-15 07:49:13.304037] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:28.551 [2024-07-15 07:49:13.304050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:28.811 pt2 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:28.811 [2024-07-15 07:49:13.496134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:28.811 [2024-07-15 07:49:13.496156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.811 [2024-07-15 07:49:13.496164] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2302aa0 00:13:28.811 [2024-07-15 07:49:13.496170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.811 [2024-07-15 07:49:13.496404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.811 [2024-07-15 07:49:13.496415] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:28.811 [2024-07-15 07:49:13.496450] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:28.811 [2024-07-15 07:49:13.496462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:28.811 [2024-07-15 07:49:13.496541] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2303100 00:13:28.811 [2024-07-15 07:49:13.496547] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:28.811 [2024-07-15 07:49:13.496676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d4280 00:13:28.811 [2024-07-15 07:49:13.496784] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2303100 00:13:28.811 [2024-07-15 07:49:13.496790] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2303100 00:13:28.811 [2024-07-15 07:49:13.496860] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.811 pt3 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:28.811 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.071 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.071 "name": "raid_bdev1", 00:13:29.071 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:29.071 "strip_size_kb": 64, 00:13:29.071 "state": "online", 00:13:29.071 "raid_level": "raid0", 00:13:29.071 "superblock": true, 00:13:29.071 "num_base_bdevs": 3, 00:13:29.071 "num_base_bdevs_discovered": 3, 00:13:29.071 "num_base_bdevs_operational": 3, 00:13:29.071 "base_bdevs_list": [ 00:13:29.071 { 00:13:29.071 "name": "pt1", 00:13:29.071 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:29.071 "is_configured": true, 00:13:29.071 "data_offset": 2048, 00:13:29.071 "data_size": 63488 00:13:29.071 }, 00:13:29.071 { 00:13:29.071 "name": "pt2", 00:13:29.071 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:29.071 "is_configured": true, 00:13:29.071 "data_offset": 2048, 00:13:29.071 "data_size": 63488 00:13:29.071 }, 00:13:29.071 { 00:13:29.071 "name": "pt3", 00:13:29.071 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:29.071 "is_configured": true, 00:13:29.071 "data_offset": 2048, 00:13:29.071 "data_size": 63488 00:13:29.071 } 00:13:29.071 ] 00:13:29.071 }' 00:13:29.071 07:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.071 07:49:13 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:29.641 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:29.641 [2024-07-15 07:49:14.382614] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:29.902 "name": "raid_bdev1", 00:13:29.902 "aliases": [ 00:13:29.902 "996e4a01-f36f-4ee9-bdc7-0fbe5586106b" 00:13:29.902 ], 00:13:29.902 "product_name": "Raid Volume", 00:13:29.902 "block_size": 512, 00:13:29.902 "num_blocks": 190464, 00:13:29.902 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:29.902 "assigned_rate_limits": { 00:13:29.902 "rw_ios_per_sec": 0, 00:13:29.902 "rw_mbytes_per_sec": 0, 00:13:29.902 "r_mbytes_per_sec": 0, 00:13:29.902 "w_mbytes_per_sec": 0 00:13:29.902 }, 00:13:29.902 "claimed": false, 00:13:29.902 "zoned": false, 00:13:29.902 "supported_io_types": { 00:13:29.902 "read": true, 00:13:29.902 "write": true, 00:13:29.902 "unmap": true, 00:13:29.902 "flush": true, 00:13:29.902 "reset": true, 00:13:29.902 "nvme_admin": false, 00:13:29.902 "nvme_io": false, 00:13:29.902 "nvme_io_md": false, 00:13:29.902 "write_zeroes": true, 00:13:29.902 "zcopy": false, 00:13:29.902 "get_zone_info": false, 00:13:29.902 "zone_management": false, 00:13:29.902 "zone_append": false, 00:13:29.902 "compare": false, 00:13:29.902 "compare_and_write": false, 00:13:29.902 "abort": false, 00:13:29.902 "seek_hole": false, 00:13:29.902 "seek_data": false, 00:13:29.902 "copy": false, 00:13:29.902 "nvme_iov_md": false 00:13:29.902 }, 00:13:29.902 "memory_domains": [ 00:13:29.902 { 00:13:29.902 "dma_device_id": "system", 00:13:29.902 "dma_device_type": 1 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.902 "dma_device_type": 2 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "dma_device_id": "system", 00:13:29.902 "dma_device_type": 1 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.902 "dma_device_type": 2 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "dma_device_id": "system", 00:13:29.902 "dma_device_type": 1 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.902 "dma_device_type": 2 00:13:29.902 } 00:13:29.902 ], 00:13:29.902 "driver_specific": { 00:13:29.902 "raid": { 00:13:29.902 "uuid": "996e4a01-f36f-4ee9-bdc7-0fbe5586106b", 00:13:29.902 "strip_size_kb": 64, 00:13:29.902 "state": "online", 00:13:29.902 "raid_level": "raid0", 00:13:29.902 "superblock": true, 00:13:29.902 "num_base_bdevs": 3, 00:13:29.902 "num_base_bdevs_discovered": 3, 
00:13:29.902 "num_base_bdevs_operational": 3, 00:13:29.902 "base_bdevs_list": [ 00:13:29.902 { 00:13:29.902 "name": "pt1", 00:13:29.902 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:29.902 "is_configured": true, 00:13:29.902 "data_offset": 2048, 00:13:29.902 "data_size": 63488 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "name": "pt2", 00:13:29.902 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:29.902 "is_configured": true, 00:13:29.902 "data_offset": 2048, 00:13:29.902 "data_size": 63488 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "name": "pt3", 00:13:29.902 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:29.902 "is_configured": true, 00:13:29.902 "data_offset": 2048, 00:13:29.902 "data_size": 63488 00:13:29.902 } 00:13:29.902 ] 00:13:29.902 } 00:13:29.902 } 00:13:29.902 }' 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:29.902 pt2 00:13:29.902 pt3' 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:29.902 "name": "pt1", 00:13:29.902 "aliases": [ 00:13:29.902 "00000000-0000-0000-0000-000000000001" 00:13:29.902 ], 00:13:29.902 "product_name": "passthru", 00:13:29.902 "block_size": 512, 00:13:29.902 "num_blocks": 65536, 00:13:29.902 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:29.902 "assigned_rate_limits": { 00:13:29.902 "rw_ios_per_sec": 0, 00:13:29.902 "rw_mbytes_per_sec": 0, 00:13:29.902 "r_mbytes_per_sec": 0, 00:13:29.902 "w_mbytes_per_sec": 0 00:13:29.902 }, 00:13:29.902 "claimed": true, 00:13:29.902 "claim_type": "exclusive_write", 00:13:29.902 "zoned": false, 00:13:29.902 "supported_io_types": { 00:13:29.902 "read": true, 00:13:29.902 "write": true, 00:13:29.902 "unmap": true, 00:13:29.902 "flush": true, 00:13:29.902 "reset": true, 00:13:29.902 "nvme_admin": false, 00:13:29.902 "nvme_io": false, 00:13:29.902 "nvme_io_md": false, 00:13:29.902 "write_zeroes": true, 00:13:29.902 "zcopy": true, 00:13:29.902 "get_zone_info": false, 00:13:29.902 "zone_management": false, 00:13:29.902 "zone_append": false, 00:13:29.902 "compare": false, 00:13:29.902 "compare_and_write": false, 00:13:29.902 "abort": true, 00:13:29.902 "seek_hole": false, 00:13:29.902 "seek_data": false, 00:13:29.902 "copy": true, 00:13:29.902 "nvme_iov_md": false 00:13:29.902 }, 00:13:29.902 "memory_domains": [ 00:13:29.902 { 00:13:29.902 "dma_device_id": "system", 00:13:29.902 "dma_device_type": 1 00:13:29.902 }, 00:13:29.902 { 00:13:29.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.902 "dma_device_type": 2 00:13:29.902 } 00:13:29.902 ], 00:13:29.902 "driver_specific": { 00:13:29.902 "passthru": { 00:13:29.902 "name": "pt1", 00:13:29.902 "base_bdev_name": "malloc1" 00:13:29.902 } 00:13:29.902 } 00:13:29.902 }' 00:13:29.902 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.163 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.422 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.422 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.422 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.422 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:30.422 07:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.422 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.422 "name": "pt2", 00:13:30.422 "aliases": [ 00:13:30.422 "00000000-0000-0000-0000-000000000002" 00:13:30.422 ], 00:13:30.422 "product_name": "passthru", 00:13:30.422 "block_size": 512, 00:13:30.422 "num_blocks": 65536, 00:13:30.422 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:30.422 "assigned_rate_limits": { 00:13:30.422 "rw_ios_per_sec": 0, 00:13:30.422 "rw_mbytes_per_sec": 0, 00:13:30.422 "r_mbytes_per_sec": 0, 00:13:30.422 "w_mbytes_per_sec": 0 00:13:30.422 }, 00:13:30.422 "claimed": true, 00:13:30.422 "claim_type": "exclusive_write", 00:13:30.422 "zoned": false, 00:13:30.422 "supported_io_types": { 00:13:30.422 "read": true, 00:13:30.422 "write": true, 00:13:30.422 "unmap": true, 00:13:30.422 "flush": true, 00:13:30.422 "reset": true, 00:13:30.422 "nvme_admin": false, 00:13:30.422 "nvme_io": false, 00:13:30.422 "nvme_io_md": false, 00:13:30.422 "write_zeroes": true, 00:13:30.422 "zcopy": true, 00:13:30.422 "get_zone_info": false, 00:13:30.422 "zone_management": false, 00:13:30.422 "zone_append": false, 00:13:30.422 "compare": false, 00:13:30.422 "compare_and_write": false, 00:13:30.422 "abort": true, 00:13:30.422 "seek_hole": false, 00:13:30.422 "seek_data": false, 00:13:30.422 "copy": true, 00:13:30.422 "nvme_iov_md": false 00:13:30.422 }, 00:13:30.422 "memory_domains": [ 00:13:30.422 { 00:13:30.422 "dma_device_id": "system", 00:13:30.422 "dma_device_type": 1 00:13:30.422 }, 00:13:30.422 { 00:13:30.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.422 "dma_device_type": 2 00:13:30.422 } 00:13:30.422 ], 00:13:30.422 "driver_specific": { 00:13:30.422 "passthru": { 00:13:30.422 "name": "pt2", 00:13:30.422 "base_bdev_name": "malloc2" 00:13:30.422 } 00:13:30.422 } 00:13:30.422 }' 00:13:30.422 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.681 07:49:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.681 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.942 "name": "pt3", 00:13:30.942 "aliases": [ 00:13:30.942 "00000000-0000-0000-0000-000000000003" 00:13:30.942 ], 00:13:30.942 "product_name": "passthru", 00:13:30.942 "block_size": 512, 00:13:30.942 "num_blocks": 65536, 00:13:30.942 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:30.942 "assigned_rate_limits": { 00:13:30.942 "rw_ios_per_sec": 0, 00:13:30.942 "rw_mbytes_per_sec": 0, 00:13:30.942 "r_mbytes_per_sec": 0, 00:13:30.942 "w_mbytes_per_sec": 0 00:13:30.942 }, 00:13:30.942 "claimed": true, 00:13:30.942 "claim_type": "exclusive_write", 00:13:30.942 "zoned": false, 00:13:30.942 "supported_io_types": { 00:13:30.942 "read": true, 00:13:30.942 "write": true, 00:13:30.942 "unmap": true, 00:13:30.942 "flush": true, 00:13:30.942 "reset": true, 00:13:30.942 "nvme_admin": false, 00:13:30.942 "nvme_io": false, 00:13:30.942 "nvme_io_md": false, 00:13:30.942 "write_zeroes": true, 00:13:30.942 "zcopy": true, 00:13:30.942 "get_zone_info": false, 00:13:30.942 "zone_management": false, 00:13:30.942 "zone_append": false, 00:13:30.942 "compare": false, 00:13:30.942 "compare_and_write": false, 00:13:30.942 "abort": true, 00:13:30.942 "seek_hole": false, 00:13:30.942 "seek_data": false, 00:13:30.942 "copy": true, 00:13:30.942 "nvme_iov_md": false 00:13:30.942 }, 00:13:30.942 "memory_domains": [ 00:13:30.942 { 00:13:30.942 "dma_device_id": "system", 00:13:30.942 "dma_device_type": 1 00:13:30.942 }, 00:13:30.942 { 00:13:30.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.942 "dma_device_type": 2 00:13:30.942 } 00:13:30.942 ], 00:13:30.942 "driver_specific": { 00:13:30.942 "passthru": { 00:13:30.942 "name": "pt3", 00:13:30.942 "base_bdev_name": "malloc3" 00:13:30.942 } 00:13:30.942 } 00:13:30.942 }' 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.942 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.203 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.463 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.463 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:31.463 07:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:31.463 [2024-07-15 07:49:16.175145] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 996e4a01-f36f-4ee9-bdc7-0fbe5586106b '!=' 996e4a01-f36f-4ee9-bdc7-0fbe5586106b ']' 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1618884 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1618884 ']' 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1618884 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:31.463 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1618884 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1618884' 00:13:31.724 killing process with pid 1618884 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1618884 00:13:31.724 [2024-07-15 07:49:16.247668] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:31.724 [2024-07-15 07:49:16.247717] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:31.724 [2024-07-15 07:49:16.247757] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:31.724 [2024-07-15 07:49:16.247763] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2303100 name raid_bdev1, state offline 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1618884 00:13:31.724 [2024-07-15 07:49:16.262588] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:31.724 07:49:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:31.724 00:13:31.724 real 0m11.798s 00:13:31.724 user 0m21.700s 00:13:31.724 sys 0m1.756s 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:31.724 07:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.724 ************************************ 00:13:31.724 END TEST raid_superblock_test 00:13:31.724 ************************************ 00:13:31.724 07:49:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:31.724 07:49:16 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:13:31.724 07:49:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:31.724 07:49:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:31.724 07:49:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:31.724 ************************************ 00:13:31.724 START TEST raid_read_error_test 00:13:31.724 ************************************ 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:31.724 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:31.724 07:49:16 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mO96whP89O 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1621192 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1621192 /var/tmp/spdk-raid.sock 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1621192 ']' 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:31.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:31.725 07:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.985 [2024-07-15 07:49:16.528931] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:13:31.985 [2024-07-15 07:49:16.528984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621192 ] 00:13:31.985 [2024-07-15 07:49:16.618624] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:31.985 [2024-07-15 07:49:16.685637] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:31.985 [2024-07-15 07:49:16.724862] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.985 [2024-07-15 07:49:16.724886] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:32.926 07:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:32.926 07:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:32.926 07:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:32.926 07:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:32.926 BaseBdev1_malloc 00:13:32.926 07:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:33.186 true 00:13:33.186 07:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:33.186 [2024-07-15 07:49:17.923395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:33.186 [2024-07-15 07:49:17.923425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:33.186 [2024-07-15 07:49:17.923437] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246ab50 00:13:33.186 [2024-07-15 07:49:17.923448] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:33.186 [2024-07-15 07:49:17.924771] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:33.186 [2024-07-15 07:49:17.924791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:33.186 BaseBdev1 00:13:33.186 07:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:33.186 07:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:33.446 BaseBdev2_malloc 00:13:33.446 07:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:33.705 true 00:13:33.705 07:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:33.965 [2024-07-15 07:49:18.478664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:33.965 [2024-07-15 07:49:18.478693] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:33.965 [2024-07-15 07:49:18.478704] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244eea0 00:13:33.965 [2024-07-15 07:49:18.478715] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:33.965 [2024-07-15 07:49:18.479900] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:33.965 [2024-07-15 07:49:18.479919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:33.965 BaseBdev2 00:13:33.965 07:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:33.965 07:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:33.965 BaseBdev3_malloc 00:13:33.965 07:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:34.224 true 00:13:34.224 07:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:34.509 [2024-07-15 07:49:19.033909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:34.510 [2024-07-15 07:49:19.033938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:34.510 [2024-07-15 07:49:19.033951] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2452fb0 00:13:34.510 [2024-07-15 07:49:19.033958] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.510 [2024-07-15 07:49:19.035141] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.510 [2024-07-15 07:49:19.035161] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:34.510 BaseBdev3 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:34.510 [2024-07-15 07:49:19.226416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:34.510 [2024-07-15 07:49:19.227423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:34.510 [2024-07-15 07:49:19.227474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:34.510 [2024-07-15 07:49:19.227627] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24540e0 00:13:34.510 [2024-07-15 07:49:19.227635] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:34.510 [2024-07-15 07:49:19.227787] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b6210 00:13:34.510 [2024-07-15 07:49:19.227901] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24540e0 00:13:34.510 [2024-07-15 07:49:19.227910] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24540e0 00:13:34.510 [2024-07-15 07:49:19.227984] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.510 
07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:34.510 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.771 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.771 "name": "raid_bdev1", 00:13:34.771 "uuid": "eacf45c6-f01b-4fbb-b290-a0cd838f6c77", 00:13:34.771 "strip_size_kb": 64, 00:13:34.771 "state": "online", 00:13:34.771 "raid_level": "raid0", 00:13:34.771 "superblock": true, 00:13:34.771 "num_base_bdevs": 3, 00:13:34.771 "num_base_bdevs_discovered": 3, 00:13:34.771 "num_base_bdevs_operational": 3, 00:13:34.771 "base_bdevs_list": [ 00:13:34.772 { 00:13:34.772 "name": "BaseBdev1", 00:13:34.772 "uuid": "93f640a5-3bf2-5fd2-941a-4d5e08ad162f", 00:13:34.772 "is_configured": true, 00:13:34.772 "data_offset": 2048, 00:13:34.772 "data_size": 63488 00:13:34.772 }, 00:13:34.772 { 00:13:34.772 "name": "BaseBdev2", 00:13:34.772 "uuid": "2441d917-dabe-563e-85c0-28f2546331a9", 00:13:34.772 "is_configured": true, 00:13:34.772 "data_offset": 2048, 00:13:34.772 "data_size": 63488 00:13:34.772 }, 00:13:34.772 { 00:13:34.772 "name": "BaseBdev3", 00:13:34.772 "uuid": "ce4ad3d8-6dd1-5b25-9767-e7308cc96e1c", 00:13:34.772 "is_configured": true, 00:13:34.772 "data_offset": 2048, 00:13:34.772 "data_size": 63488 00:13:34.772 } 00:13:34.772 ] 00:13:34.772 }' 00:13:34.772 07:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.772 07:49:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.341 07:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:35.341 07:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:35.602 [2024-07-15 07:49:20.100846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2453da0 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.541 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:36.801 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.801 "name": "raid_bdev1", 00:13:36.801 "uuid": "eacf45c6-f01b-4fbb-b290-a0cd838f6c77", 00:13:36.801 "strip_size_kb": 64, 00:13:36.801 "state": "online", 00:13:36.801 "raid_level": "raid0", 00:13:36.801 "superblock": true, 00:13:36.801 "num_base_bdevs": 3, 00:13:36.801 "num_base_bdevs_discovered": 3, 00:13:36.801 "num_base_bdevs_operational": 3, 00:13:36.801 "base_bdevs_list": [ 00:13:36.801 { 00:13:36.801 "name": "BaseBdev1", 00:13:36.801 "uuid": "93f640a5-3bf2-5fd2-941a-4d5e08ad162f", 00:13:36.801 "is_configured": true, 00:13:36.801 "data_offset": 2048, 00:13:36.801 "data_size": 63488 00:13:36.801 }, 00:13:36.801 { 00:13:36.801 "name": "BaseBdev2", 00:13:36.801 "uuid": "2441d917-dabe-563e-85c0-28f2546331a9", 00:13:36.801 "is_configured": true, 00:13:36.801 "data_offset": 2048, 00:13:36.801 "data_size": 63488 00:13:36.801 }, 00:13:36.801 { 00:13:36.801 "name": "BaseBdev3", 00:13:36.801 "uuid": "ce4ad3d8-6dd1-5b25-9767-e7308cc96e1c", 00:13:36.801 "is_configured": true, 00:13:36.801 "data_offset": 2048, 00:13:36.801 "data_size": 63488 00:13:36.801 } 00:13:36.801 ] 00:13:36.801 }' 00:13:36.801 07:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.801 07:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.370 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:37.630 [2024-07-15 07:49:22.179212] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:37.630 [2024-07-15 07:49:22.179238] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:37.630 [2024-07-15 
07:49:22.181824] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:37.630 [2024-07-15 07:49:22.181851] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:37.630 [2024-07-15 07:49:22.181875] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:37.630 [2024-07-15 07:49:22.181880] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24540e0 name raid_bdev1, state offline 00:13:37.630 0 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1621192 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1621192 ']' 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1621192 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1621192 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1621192' 00:13:37.630 killing process with pid 1621192 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1621192 00:13:37.630 [2024-07-15 07:49:22.249064] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1621192 00:13:37.630 [2024-07-15 07:49:22.260167] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.mO96whP89O 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:37.630 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.48 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.48 != \0\.\0\0 ]] 00:13:37.890 00:13:37.890 real 0m5.933s 00:13:37.890 user 0m9.470s 00:13:37.890 sys 0m0.828s 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:37.890 07:49:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.890 ************************************ 00:13:37.890 END TEST raid_read_error_test 00:13:37.890 ************************************ 00:13:37.890 07:49:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:37.890 07:49:22 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:13:37.890 07:49:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:37.890 
07:49:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:37.890 07:49:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.890 ************************************ 00:13:37.890 START TEST raid_write_error_test 00:13:37.890 ************************************ 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TtU4UUvHOA 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1622234 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1622234 /var/tmp/spdk-raid.sock 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1622234 ']' 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:37.890 07:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.890 [2024-07-15 07:49:22.548886] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:13:37.890 [2024-07-15 07:49:22.548950] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622234 ] 00:13:37.890 [2024-07-15 07:49:22.636430] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:38.150 [2024-07-15 07:49:22.700344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:38.150 [2024-07-15 07:49:22.742603] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.150 [2024-07-15 07:49:22.742628] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.719 07:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:38.719 07:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:38.719 07:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:38.719 07:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:38.978 BaseBdev1_malloc 00:13:38.978 07:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:38.978 true 00:13:38.978 07:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:39.237 [2024-07-15 07:49:23.881471] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:39.237 [2024-07-15 07:49:23.881499] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.237 [2024-07-15 07:49:23.881510] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef8b50 00:13:39.237 [2024-07-15 07:49:23.881516] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.237 [2024-07-15 07:49:23.882814] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.237 [2024-07-15 07:49:23.882834] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: 
BaseBdev1 00:13:39.237 BaseBdev1 00:13:39.237 07:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:39.237 07:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:39.497 BaseBdev2_malloc 00:13:39.497 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:39.497 true 00:13:39.497 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:39.756 [2024-07-15 07:49:24.392672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:39.756 [2024-07-15 07:49:24.392699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.756 [2024-07-15 07:49:24.392714] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xedcea0 00:13:39.756 [2024-07-15 07:49:24.392721] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.756 [2024-07-15 07:49:24.393873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.756 [2024-07-15 07:49:24.393892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:39.756 BaseBdev2 00:13:39.756 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:39.757 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:40.016 BaseBdev3_malloc 00:13:40.016 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:40.276 true 00:13:40.276 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:40.276 [2024-07-15 07:49:24.968014] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:40.276 [2024-07-15 07:49:24.968041] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:40.276 [2024-07-15 07:49:24.968054] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee0fb0 00:13:40.276 [2024-07-15 07:49:24.968060] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:40.276 [2024-07-15 07:49:24.969236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:40.276 [2024-07-15 07:49:24.969255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:40.276 BaseBdev3 00:13:40.276 07:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:40.536 [2024-07-15 07:49:25.156619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:13:40.536 [2024-07-15 07:49:25.157636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:40.536 [2024-07-15 07:49:25.157687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:40.536 [2024-07-15 07:49:25.157843] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xee20e0 00:13:40.536 [2024-07-15 07:49:25.157850] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:40.536 [2024-07-15 07:49:25.157995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd44210 00:13:40.536 [2024-07-15 07:49:25.158109] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xee20e0 00:13:40.536 [2024-07-15 07:49:25.158114] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xee20e0 00:13:40.536 [2024-07-15 07:49:25.158189] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.536 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:40.796 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.796 "name": "raid_bdev1", 00:13:40.796 "uuid": "f6b268d3-ec1a-47f5-b824-b5812224ec42", 00:13:40.796 "strip_size_kb": 64, 00:13:40.796 "state": "online", 00:13:40.796 "raid_level": "raid0", 00:13:40.796 "superblock": true, 00:13:40.796 "num_base_bdevs": 3, 00:13:40.796 "num_base_bdevs_discovered": 3, 00:13:40.796 "num_base_bdevs_operational": 3, 00:13:40.796 "base_bdevs_list": [ 00:13:40.796 { 00:13:40.796 "name": "BaseBdev1", 00:13:40.796 "uuid": "cd82e1e7-6392-5090-8ba2-5c12d859b7e5", 00:13:40.796 "is_configured": true, 00:13:40.796 "data_offset": 2048, 00:13:40.796 "data_size": 63488 00:13:40.796 }, 00:13:40.796 { 00:13:40.796 "name": "BaseBdev2", 00:13:40.796 "uuid": "a7561914-b3c0-51ca-bde3-901f032d9d2d", 00:13:40.796 "is_configured": true, 00:13:40.796 "data_offset": 2048, 00:13:40.796 "data_size": 63488 00:13:40.796 }, 00:13:40.796 { 00:13:40.796 "name": "BaseBdev3", 00:13:40.796 "uuid": "d8a7dba6-1c2e-50c7-96b7-9104f92b11a3", 00:13:40.796 
"is_configured": true, 00:13:40.796 "data_offset": 2048, 00:13:40.796 "data_size": 63488 00:13:40.796 } 00:13:40.796 ] 00:13:40.796 }' 00:13:40.796 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.796 07:49:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.365 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:41.365 07:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:41.365 [2024-07-15 07:49:26.043064] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee1da0 00:13:42.306 07:49:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.566 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.826 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.826 "name": "raid_bdev1", 00:13:42.826 "uuid": "f6b268d3-ec1a-47f5-b824-b5812224ec42", 00:13:42.826 "strip_size_kb": 64, 00:13:42.826 "state": "online", 00:13:42.826 "raid_level": "raid0", 00:13:42.826 "superblock": true, 00:13:42.826 "num_base_bdevs": 3, 00:13:42.826 "num_base_bdevs_discovered": 3, 00:13:42.826 "num_base_bdevs_operational": 3, 00:13:42.826 "base_bdevs_list": [ 00:13:42.826 { 00:13:42.826 "name": "BaseBdev1", 00:13:42.826 "uuid": "cd82e1e7-6392-5090-8ba2-5c12d859b7e5", 00:13:42.826 "is_configured": true, 00:13:42.826 "data_offset": 2048, 00:13:42.826 "data_size": 63488 00:13:42.826 }, 00:13:42.826 { 00:13:42.826 "name": "BaseBdev2", 00:13:42.826 "uuid": 
"a7561914-b3c0-51ca-bde3-901f032d9d2d", 00:13:42.826 "is_configured": true, 00:13:42.826 "data_offset": 2048, 00:13:42.826 "data_size": 63488 00:13:42.826 }, 00:13:42.826 { 00:13:42.826 "name": "BaseBdev3", 00:13:42.826 "uuid": "d8a7dba6-1c2e-50c7-96b7-9104f92b11a3", 00:13:42.826 "is_configured": true, 00:13:42.826 "data_offset": 2048, 00:13:42.826 "data_size": 63488 00:13:42.826 } 00:13:42.826 ] 00:13:42.826 }' 00:13:42.826 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.826 07:49:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.396 07:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:43.396 [2024-07-15 07:49:28.082099] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:43.396 [2024-07-15 07:49:28.082131] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.396 [2024-07-15 07:49:28.084818] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.396 [2024-07-15 07:49:28.084843] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:43.396 [2024-07-15 07:49:28.084866] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:43.396 [2024-07-15 07:49:28.084872] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee20e0 name raid_bdev1, state offline 00:13:43.396 0 00:13:43.396 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1622234 00:13:43.396 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1622234 ']' 00:13:43.396 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1622234 00:13:43.396 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:43.396 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:43.396 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1622234 00:13:43.656 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:43.656 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:43.656 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1622234' 00:13:43.656 killing process with pid 1622234 00:13:43.656 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1622234 00:13:43.656 [2024-07-15 07:49:28.171858] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1622234 00:13:43.657 [2024-07-15 07:49:28.183259] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TtU4UUvHOA 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:13:43.657 07:49:28 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:13:43.657 00:13:43.657 real 0m5.848s 00:13:43.657 user 0m9.367s 00:13:43.657 sys 0m0.804s 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:43.657 07:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.657 ************************************ 00:13:43.657 END TEST raid_write_error_test 00:13:43.657 ************************************ 00:13:43.657 07:49:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:43.657 07:49:28 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:43.657 07:49:28 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:13:43.657 07:49:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:43.657 07:49:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:43.657 07:49:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:43.657 ************************************ 00:13:43.657 START TEST raid_state_function_test 00:13:43.657 ************************************ 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1623469 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1623469' 00:13:43.657 Process raid pid: 1623469 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1623469 /var/tmp/spdk-raid.sock 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1623469 ']' 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:43.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:43.657 07:49:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.917 [2024-07-15 07:49:28.453111] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:13:43.917 [2024-07-15 07:49:28.453162] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:43.917 [2024-07-15 07:49:28.544129] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:43.917 [2024-07-15 07:49:28.609721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:43.917 [2024-07-15 07:49:28.654259] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:43.917 [2024-07-15 07:49:28.654293] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.859 07:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:44.859 07:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:44.859 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:44.859 [2024-07-15 07:49:29.469800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:44.859 [2024-07-15 07:49:29.469831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:44.859 [2024-07-15 07:49:29.469837] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:44.859 [2024-07-15 07:49:29.469843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:44.859 [2024-07-15 07:49:29.469848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:44.859 [2024-07-15 07:49:29.469853] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:44.859 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:44.859 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.860 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.120 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:13:45.120 "name": "Existed_Raid", 00:13:45.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.120 "strip_size_kb": 64, 00:13:45.120 "state": "configuring", 00:13:45.120 "raid_level": "concat", 00:13:45.120 "superblock": false, 00:13:45.120 "num_base_bdevs": 3, 00:13:45.120 "num_base_bdevs_discovered": 0, 00:13:45.120 "num_base_bdevs_operational": 3, 00:13:45.120 "base_bdevs_list": [ 00:13:45.120 { 00:13:45.120 "name": "BaseBdev1", 00:13:45.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.120 "is_configured": false, 00:13:45.120 "data_offset": 0, 00:13:45.120 "data_size": 0 00:13:45.120 }, 00:13:45.120 { 00:13:45.120 "name": "BaseBdev2", 00:13:45.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.120 "is_configured": false, 00:13:45.120 "data_offset": 0, 00:13:45.120 "data_size": 0 00:13:45.120 }, 00:13:45.120 { 00:13:45.120 "name": "BaseBdev3", 00:13:45.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.120 "is_configured": false, 00:13:45.120 "data_offset": 0, 00:13:45.120 "data_size": 0 00:13:45.120 } 00:13:45.120 ] 00:13:45.120 }' 00:13:45.120 07:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.120 07:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.691 07:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:45.691 [2024-07-15 07:49:30.404072] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:45.691 [2024-07-15 07:49:30.404092] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17526d0 name Existed_Raid, state configuring 00:13:45.691 07:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.951 [2024-07-15 07:49:30.600579] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.951 [2024-07-15 07:49:30.600598] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.951 [2024-07-15 07:49:30.600604] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:45.951 [2024-07-15 07:49:30.600609] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.951 [2024-07-15 07:49:30.600613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.951 [2024-07-15 07:49:30.600619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.951 07:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:46.211 [2024-07-15 07:49:30.791546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:46.211 BaseBdev1 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:46.211 07:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:46.472 [ 00:13:46.472 { 00:13:46.472 "name": "BaseBdev1", 00:13:46.472 "aliases": [ 00:13:46.472 "67595205-1bc2-454a-a6f1-6bd51520d26b" 00:13:46.472 ], 00:13:46.472 "product_name": "Malloc disk", 00:13:46.472 "block_size": 512, 00:13:46.472 "num_blocks": 65536, 00:13:46.472 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:46.472 "assigned_rate_limits": { 00:13:46.472 "rw_ios_per_sec": 0, 00:13:46.472 "rw_mbytes_per_sec": 0, 00:13:46.472 "r_mbytes_per_sec": 0, 00:13:46.472 "w_mbytes_per_sec": 0 00:13:46.472 }, 00:13:46.472 "claimed": true, 00:13:46.472 "claim_type": "exclusive_write", 00:13:46.472 "zoned": false, 00:13:46.472 "supported_io_types": { 00:13:46.472 "read": true, 00:13:46.472 "write": true, 00:13:46.472 "unmap": true, 00:13:46.472 "flush": true, 00:13:46.472 "reset": true, 00:13:46.472 "nvme_admin": false, 00:13:46.472 "nvme_io": false, 00:13:46.472 "nvme_io_md": false, 00:13:46.472 "write_zeroes": true, 00:13:46.472 "zcopy": true, 00:13:46.472 "get_zone_info": false, 00:13:46.472 "zone_management": false, 00:13:46.472 "zone_append": false, 00:13:46.472 "compare": false, 00:13:46.472 "compare_and_write": false, 00:13:46.472 "abort": true, 00:13:46.472 "seek_hole": false, 00:13:46.472 "seek_data": false, 00:13:46.472 "copy": true, 00:13:46.472 "nvme_iov_md": false 00:13:46.472 }, 00:13:46.472 "memory_domains": [ 00:13:46.472 { 00:13:46.472 "dma_device_id": "system", 00:13:46.472 "dma_device_type": 1 00:13:46.472 }, 00:13:46.472 { 00:13:46.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.472 "dma_device_type": 2 00:13:46.472 } 00:13:46.472 ], 00:13:46.472 "driver_specific": {} 00:13:46.472 } 00:13:46.472 ] 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.473 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.734 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.734 "name": "Existed_Raid", 00:13:46.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.734 "strip_size_kb": 64, 00:13:46.734 "state": "configuring", 00:13:46.734 "raid_level": "concat", 00:13:46.734 "superblock": false, 00:13:46.734 "num_base_bdevs": 3, 00:13:46.734 "num_base_bdevs_discovered": 1, 00:13:46.734 "num_base_bdevs_operational": 3, 00:13:46.734 "base_bdevs_list": [ 00:13:46.734 { 00:13:46.734 "name": "BaseBdev1", 00:13:46.734 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:46.734 "is_configured": true, 00:13:46.734 "data_offset": 0, 00:13:46.734 "data_size": 65536 00:13:46.734 }, 00:13:46.734 { 00:13:46.734 "name": "BaseBdev2", 00:13:46.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.734 "is_configured": false, 00:13:46.734 "data_offset": 0, 00:13:46.734 "data_size": 0 00:13:46.734 }, 00:13:46.734 { 00:13:46.734 "name": "BaseBdev3", 00:13:46.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.734 "is_configured": false, 00:13:46.734 "data_offset": 0, 00:13:46.734 "data_size": 0 00:13:46.734 } 00:13:46.734 ] 00:13:46.734 }' 00:13:46.734 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.734 07:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.305 07:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:47.566 [2024-07-15 07:49:32.078793] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:47.566 [2024-07-15 07:49:32.078820] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1751fa0 name Existed_Raid, state configuring 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:47.566 [2024-07-15 07:49:32.275311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:47.566 [2024-07-15 07:49:32.276430] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:47.566 [2024-07-15 07:49:32.276458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:47.566 [2024-07-15 07:49:32.276464] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:47.566 [2024-07-15 07:49:32.276470] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.566 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.826 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.826 "name": "Existed_Raid", 00:13:47.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.826 "strip_size_kb": 64, 00:13:47.826 "state": "configuring", 00:13:47.826 "raid_level": "concat", 00:13:47.826 "superblock": false, 00:13:47.826 "num_base_bdevs": 3, 00:13:47.826 "num_base_bdevs_discovered": 1, 00:13:47.826 "num_base_bdevs_operational": 3, 00:13:47.826 "base_bdevs_list": [ 00:13:47.826 { 00:13:47.826 "name": "BaseBdev1", 00:13:47.826 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:47.826 "is_configured": true, 00:13:47.826 "data_offset": 0, 00:13:47.826 "data_size": 65536 00:13:47.826 }, 00:13:47.826 { 00:13:47.826 "name": "BaseBdev2", 00:13:47.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.826 "is_configured": false, 00:13:47.826 "data_offset": 0, 00:13:47.827 "data_size": 0 00:13:47.827 }, 00:13:47.827 { 00:13:47.827 "name": "BaseBdev3", 00:13:47.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.827 "is_configured": false, 00:13:47.827 "data_offset": 0, 00:13:47.827 "data_size": 0 00:13:47.827 } 00:13:47.827 ] 00:13:47.827 }' 00:13:47.827 07:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.827 07:49:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.398 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:48.657 [2024-07-15 07:49:33.230640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:48.657 BaseBdev2 00:13:48.657 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:48.657 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:48.657 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:48.657 07:49:33 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:48.657 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:48.657 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:48.657 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:48.916 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:48.916 [ 00:13:48.916 { 00:13:48.916 "name": "BaseBdev2", 00:13:48.916 "aliases": [ 00:13:48.916 "808ee52b-9281-4da6-9804-d0496eb83eaa" 00:13:48.916 ], 00:13:48.916 "product_name": "Malloc disk", 00:13:48.916 "block_size": 512, 00:13:48.916 "num_blocks": 65536, 00:13:48.916 "uuid": "808ee52b-9281-4da6-9804-d0496eb83eaa", 00:13:48.916 "assigned_rate_limits": { 00:13:48.916 "rw_ios_per_sec": 0, 00:13:48.916 "rw_mbytes_per_sec": 0, 00:13:48.916 "r_mbytes_per_sec": 0, 00:13:48.916 "w_mbytes_per_sec": 0 00:13:48.916 }, 00:13:48.916 "claimed": true, 00:13:48.916 "claim_type": "exclusive_write", 00:13:48.916 "zoned": false, 00:13:48.916 "supported_io_types": { 00:13:48.916 "read": true, 00:13:48.916 "write": true, 00:13:48.916 "unmap": true, 00:13:48.916 "flush": true, 00:13:48.916 "reset": true, 00:13:48.916 "nvme_admin": false, 00:13:48.916 "nvme_io": false, 00:13:48.916 "nvme_io_md": false, 00:13:48.916 "write_zeroes": true, 00:13:48.917 "zcopy": true, 00:13:48.917 "get_zone_info": false, 00:13:48.917 "zone_management": false, 00:13:48.917 "zone_append": false, 00:13:48.917 "compare": false, 00:13:48.917 "compare_and_write": false, 00:13:48.917 "abort": true, 00:13:48.917 "seek_hole": false, 00:13:48.917 "seek_data": false, 00:13:48.917 "copy": true, 00:13:48.917 "nvme_iov_md": false 00:13:48.917 }, 00:13:48.917 "memory_domains": [ 00:13:48.917 { 00:13:48.917 "dma_device_id": "system", 00:13:48.917 "dma_device_type": 1 00:13:48.917 }, 00:13:48.917 { 00:13:48.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.917 "dma_device_type": 2 00:13:48.917 } 00:13:48.917 ], 00:13:48.917 "driver_specific": {} 00:13:48.917 } 00:13:48.917 ] 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.917 
07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.917 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.176 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.176 "name": "Existed_Raid", 00:13:49.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.176 "strip_size_kb": 64, 00:13:49.176 "state": "configuring", 00:13:49.176 "raid_level": "concat", 00:13:49.176 "superblock": false, 00:13:49.176 "num_base_bdevs": 3, 00:13:49.176 "num_base_bdevs_discovered": 2, 00:13:49.176 "num_base_bdevs_operational": 3, 00:13:49.176 "base_bdevs_list": [ 00:13:49.176 { 00:13:49.176 "name": "BaseBdev1", 00:13:49.176 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:49.176 "is_configured": true, 00:13:49.176 "data_offset": 0, 00:13:49.176 "data_size": 65536 00:13:49.176 }, 00:13:49.176 { 00:13:49.176 "name": "BaseBdev2", 00:13:49.176 "uuid": "808ee52b-9281-4da6-9804-d0496eb83eaa", 00:13:49.176 "is_configured": true, 00:13:49.176 "data_offset": 0, 00:13:49.176 "data_size": 65536 00:13:49.176 }, 00:13:49.176 { 00:13:49.176 "name": "BaseBdev3", 00:13:49.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.176 "is_configured": false, 00:13:49.176 "data_offset": 0, 00:13:49.176 "data_size": 0 00:13:49.176 } 00:13:49.176 ] 00:13:49.176 }' 00:13:49.176 07:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.176 07:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.745 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:50.005 [2024-07-15 07:49:34.554743] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:50.005 [2024-07-15 07:49:34.554769] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1752e90 00:13:50.005 [2024-07-15 07:49:34.554774] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:50.005 [2024-07-15 07:49:34.554919] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1752b60 00:13:50.005 [2024-07-15 07:49:34.555016] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1752e90 00:13:50.005 [2024-07-15 07:49:34.555021] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1752e90 00:13:50.005 [2024-07-15 07:49:34.555137] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.005 BaseBdev3 00:13:50.005 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:50.005 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:50.005 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:50.005 07:49:34 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:50.005 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:50.005 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:50.005 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:50.265 [ 00:13:50.265 { 00:13:50.265 "name": "BaseBdev3", 00:13:50.265 "aliases": [ 00:13:50.265 "c22d6b27-42e3-45cd-9187-af309a8d9d2b" 00:13:50.265 ], 00:13:50.265 "product_name": "Malloc disk", 00:13:50.265 "block_size": 512, 00:13:50.265 "num_blocks": 65536, 00:13:50.265 "uuid": "c22d6b27-42e3-45cd-9187-af309a8d9d2b", 00:13:50.265 "assigned_rate_limits": { 00:13:50.265 "rw_ios_per_sec": 0, 00:13:50.265 "rw_mbytes_per_sec": 0, 00:13:50.265 "r_mbytes_per_sec": 0, 00:13:50.265 "w_mbytes_per_sec": 0 00:13:50.265 }, 00:13:50.265 "claimed": true, 00:13:50.265 "claim_type": "exclusive_write", 00:13:50.265 "zoned": false, 00:13:50.265 "supported_io_types": { 00:13:50.265 "read": true, 00:13:50.265 "write": true, 00:13:50.265 "unmap": true, 00:13:50.265 "flush": true, 00:13:50.265 "reset": true, 00:13:50.265 "nvme_admin": false, 00:13:50.265 "nvme_io": false, 00:13:50.265 "nvme_io_md": false, 00:13:50.265 "write_zeroes": true, 00:13:50.265 "zcopy": true, 00:13:50.265 "get_zone_info": false, 00:13:50.265 "zone_management": false, 00:13:50.265 "zone_append": false, 00:13:50.265 "compare": false, 00:13:50.265 "compare_and_write": false, 00:13:50.265 "abort": true, 00:13:50.265 "seek_hole": false, 00:13:50.265 "seek_data": false, 00:13:50.265 "copy": true, 00:13:50.265 "nvme_iov_md": false 00:13:50.265 }, 00:13:50.265 "memory_domains": [ 00:13:50.265 { 00:13:50.265 "dma_device_id": "system", 00:13:50.265 "dma_device_type": 1 00:13:50.265 }, 00:13:50.265 { 00:13:50.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.265 "dma_device_type": 2 00:13:50.265 } 00:13:50.265 ], 00:13:50.265 "driver_specific": {} 00:13:50.265 } 00:13:50.265 ] 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.265 07:49:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.265 07:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.525 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.525 "name": "Existed_Raid", 00:13:50.525 "uuid": "7b1e8412-3731-477e-b470-d058d70f032d", 00:13:50.525 "strip_size_kb": 64, 00:13:50.525 "state": "online", 00:13:50.525 "raid_level": "concat", 00:13:50.525 "superblock": false, 00:13:50.525 "num_base_bdevs": 3, 00:13:50.525 "num_base_bdevs_discovered": 3, 00:13:50.525 "num_base_bdevs_operational": 3, 00:13:50.525 "base_bdevs_list": [ 00:13:50.525 { 00:13:50.525 "name": "BaseBdev1", 00:13:50.525 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:50.525 "is_configured": true, 00:13:50.525 "data_offset": 0, 00:13:50.525 "data_size": 65536 00:13:50.525 }, 00:13:50.525 { 00:13:50.525 "name": "BaseBdev2", 00:13:50.525 "uuid": "808ee52b-9281-4da6-9804-d0496eb83eaa", 00:13:50.525 "is_configured": true, 00:13:50.525 "data_offset": 0, 00:13:50.525 "data_size": 65536 00:13:50.525 }, 00:13:50.525 { 00:13:50.525 "name": "BaseBdev3", 00:13:50.525 "uuid": "c22d6b27-42e3-45cd-9187-af309a8d9d2b", 00:13:50.525 "is_configured": true, 00:13:50.525 "data_offset": 0, 00:13:50.525 "data_size": 65536 00:13:50.525 } 00:13:50.525 ] 00:13:50.525 }' 00:13:50.525 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.525 07:49:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:51.095 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:51.095 [2024-07-15 07:49:35.846258] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:51.355 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:51.355 "name": "Existed_Raid", 00:13:51.355 "aliases": [ 00:13:51.355 "7b1e8412-3731-477e-b470-d058d70f032d" 00:13:51.355 ], 00:13:51.355 "product_name": "Raid Volume", 00:13:51.355 "block_size": 512, 00:13:51.355 "num_blocks": 196608, 00:13:51.355 "uuid": "7b1e8412-3731-477e-b470-d058d70f032d", 
00:13:51.355 "assigned_rate_limits": { 00:13:51.355 "rw_ios_per_sec": 0, 00:13:51.355 "rw_mbytes_per_sec": 0, 00:13:51.355 "r_mbytes_per_sec": 0, 00:13:51.355 "w_mbytes_per_sec": 0 00:13:51.355 }, 00:13:51.355 "claimed": false, 00:13:51.355 "zoned": false, 00:13:51.355 "supported_io_types": { 00:13:51.355 "read": true, 00:13:51.355 "write": true, 00:13:51.355 "unmap": true, 00:13:51.355 "flush": true, 00:13:51.355 "reset": true, 00:13:51.355 "nvme_admin": false, 00:13:51.355 "nvme_io": false, 00:13:51.355 "nvme_io_md": false, 00:13:51.355 "write_zeroes": true, 00:13:51.355 "zcopy": false, 00:13:51.355 "get_zone_info": false, 00:13:51.355 "zone_management": false, 00:13:51.355 "zone_append": false, 00:13:51.355 "compare": false, 00:13:51.355 "compare_and_write": false, 00:13:51.355 "abort": false, 00:13:51.355 "seek_hole": false, 00:13:51.355 "seek_data": false, 00:13:51.355 "copy": false, 00:13:51.355 "nvme_iov_md": false 00:13:51.355 }, 00:13:51.355 "memory_domains": [ 00:13:51.355 { 00:13:51.355 "dma_device_id": "system", 00:13:51.355 "dma_device_type": 1 00:13:51.355 }, 00:13:51.355 { 00:13:51.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.355 "dma_device_type": 2 00:13:51.355 }, 00:13:51.355 { 00:13:51.355 "dma_device_id": "system", 00:13:51.355 "dma_device_type": 1 00:13:51.355 }, 00:13:51.355 { 00:13:51.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.355 "dma_device_type": 2 00:13:51.355 }, 00:13:51.355 { 00:13:51.355 "dma_device_id": "system", 00:13:51.355 "dma_device_type": 1 00:13:51.355 }, 00:13:51.355 { 00:13:51.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.355 "dma_device_type": 2 00:13:51.355 } 00:13:51.355 ], 00:13:51.356 "driver_specific": { 00:13:51.356 "raid": { 00:13:51.356 "uuid": "7b1e8412-3731-477e-b470-d058d70f032d", 00:13:51.356 "strip_size_kb": 64, 00:13:51.356 "state": "online", 00:13:51.356 "raid_level": "concat", 00:13:51.356 "superblock": false, 00:13:51.356 "num_base_bdevs": 3, 00:13:51.356 "num_base_bdevs_discovered": 3, 00:13:51.356 "num_base_bdevs_operational": 3, 00:13:51.356 "base_bdevs_list": [ 00:13:51.356 { 00:13:51.356 "name": "BaseBdev1", 00:13:51.356 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:51.356 "is_configured": true, 00:13:51.356 "data_offset": 0, 00:13:51.356 "data_size": 65536 00:13:51.356 }, 00:13:51.356 { 00:13:51.356 "name": "BaseBdev2", 00:13:51.356 "uuid": "808ee52b-9281-4da6-9804-d0496eb83eaa", 00:13:51.356 "is_configured": true, 00:13:51.356 "data_offset": 0, 00:13:51.356 "data_size": 65536 00:13:51.356 }, 00:13:51.356 { 00:13:51.356 "name": "BaseBdev3", 00:13:51.356 "uuid": "c22d6b27-42e3-45cd-9187-af309a8d9d2b", 00:13:51.356 "is_configured": true, 00:13:51.356 "data_offset": 0, 00:13:51.356 "data_size": 65536 00:13:51.356 } 00:13:51.356 ] 00:13:51.356 } 00:13:51.356 } 00:13:51.356 }' 00:13:51.356 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:51.356 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:51.356 BaseBdev2 00:13:51.356 BaseBdev3' 00:13:51.356 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.356 07:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:51.356 07:49:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.356 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:51.356 "name": "BaseBdev1", 00:13:51.356 "aliases": [ 00:13:51.356 "67595205-1bc2-454a-a6f1-6bd51520d26b" 00:13:51.356 ], 00:13:51.356 "product_name": "Malloc disk", 00:13:51.356 "block_size": 512, 00:13:51.356 "num_blocks": 65536, 00:13:51.356 "uuid": "67595205-1bc2-454a-a6f1-6bd51520d26b", 00:13:51.356 "assigned_rate_limits": { 00:13:51.356 "rw_ios_per_sec": 0, 00:13:51.356 "rw_mbytes_per_sec": 0, 00:13:51.356 "r_mbytes_per_sec": 0, 00:13:51.356 "w_mbytes_per_sec": 0 00:13:51.356 }, 00:13:51.356 "claimed": true, 00:13:51.356 "claim_type": "exclusive_write", 00:13:51.356 "zoned": false, 00:13:51.356 "supported_io_types": { 00:13:51.356 "read": true, 00:13:51.356 "write": true, 00:13:51.356 "unmap": true, 00:13:51.356 "flush": true, 00:13:51.356 "reset": true, 00:13:51.356 "nvme_admin": false, 00:13:51.356 "nvme_io": false, 00:13:51.356 "nvme_io_md": false, 00:13:51.356 "write_zeroes": true, 00:13:51.356 "zcopy": true, 00:13:51.356 "get_zone_info": false, 00:13:51.356 "zone_management": false, 00:13:51.356 "zone_append": false, 00:13:51.356 "compare": false, 00:13:51.356 "compare_and_write": false, 00:13:51.356 "abort": true, 00:13:51.356 "seek_hole": false, 00:13:51.356 "seek_data": false, 00:13:51.356 "copy": true, 00:13:51.356 "nvme_iov_md": false 00:13:51.356 }, 00:13:51.356 "memory_domains": [ 00:13:51.356 { 00:13:51.356 "dma_device_id": "system", 00:13:51.356 "dma_device_type": 1 00:13:51.356 }, 00:13:51.356 { 00:13:51.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.356 "dma_device_type": 2 00:13:51.356 } 00:13:51.356 ], 00:13:51.356 "driver_specific": {} 00:13:51.356 }' 00:13:51.356 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:51.630 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.938 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:51.939 "name": "BaseBdev2", 
00:13:51.939 "aliases": [ 00:13:51.939 "808ee52b-9281-4da6-9804-d0496eb83eaa" 00:13:51.939 ], 00:13:51.939 "product_name": "Malloc disk", 00:13:51.939 "block_size": 512, 00:13:51.939 "num_blocks": 65536, 00:13:51.939 "uuid": "808ee52b-9281-4da6-9804-d0496eb83eaa", 00:13:51.939 "assigned_rate_limits": { 00:13:51.939 "rw_ios_per_sec": 0, 00:13:51.939 "rw_mbytes_per_sec": 0, 00:13:51.939 "r_mbytes_per_sec": 0, 00:13:51.939 "w_mbytes_per_sec": 0 00:13:51.939 }, 00:13:51.939 "claimed": true, 00:13:51.939 "claim_type": "exclusive_write", 00:13:51.939 "zoned": false, 00:13:51.939 "supported_io_types": { 00:13:51.939 "read": true, 00:13:51.939 "write": true, 00:13:51.939 "unmap": true, 00:13:51.939 "flush": true, 00:13:51.939 "reset": true, 00:13:51.939 "nvme_admin": false, 00:13:51.939 "nvme_io": false, 00:13:51.939 "nvme_io_md": false, 00:13:51.939 "write_zeroes": true, 00:13:51.939 "zcopy": true, 00:13:51.939 "get_zone_info": false, 00:13:51.939 "zone_management": false, 00:13:51.939 "zone_append": false, 00:13:51.939 "compare": false, 00:13:51.939 "compare_and_write": false, 00:13:51.939 "abort": true, 00:13:51.939 "seek_hole": false, 00:13:51.939 "seek_data": false, 00:13:51.939 "copy": true, 00:13:51.939 "nvme_iov_md": false 00:13:51.939 }, 00:13:51.939 "memory_domains": [ 00:13:51.939 { 00:13:51.939 "dma_device_id": "system", 00:13:51.939 "dma_device_type": 1 00:13:51.939 }, 00:13:51.939 { 00:13:51.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.939 "dma_device_type": 2 00:13:51.939 } 00:13:51.939 ], 00:13:51.939 "driver_specific": {} 00:13:51.939 }' 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.939 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.207 07:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:52.466 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.466 "name": "BaseBdev3", 00:13:52.466 "aliases": [ 00:13:52.466 "c22d6b27-42e3-45cd-9187-af309a8d9d2b" 00:13:52.466 ], 00:13:52.466 "product_name": "Malloc disk", 00:13:52.466 "block_size": 512, 
00:13:52.466 "num_blocks": 65536, 00:13:52.466 "uuid": "c22d6b27-42e3-45cd-9187-af309a8d9d2b", 00:13:52.466 "assigned_rate_limits": { 00:13:52.466 "rw_ios_per_sec": 0, 00:13:52.466 "rw_mbytes_per_sec": 0, 00:13:52.466 "r_mbytes_per_sec": 0, 00:13:52.466 "w_mbytes_per_sec": 0 00:13:52.466 }, 00:13:52.466 "claimed": true, 00:13:52.466 "claim_type": "exclusive_write", 00:13:52.466 "zoned": false, 00:13:52.466 "supported_io_types": { 00:13:52.466 "read": true, 00:13:52.466 "write": true, 00:13:52.466 "unmap": true, 00:13:52.466 "flush": true, 00:13:52.466 "reset": true, 00:13:52.466 "nvme_admin": false, 00:13:52.466 "nvme_io": false, 00:13:52.466 "nvme_io_md": false, 00:13:52.466 "write_zeroes": true, 00:13:52.466 "zcopy": true, 00:13:52.466 "get_zone_info": false, 00:13:52.466 "zone_management": false, 00:13:52.466 "zone_append": false, 00:13:52.466 "compare": false, 00:13:52.466 "compare_and_write": false, 00:13:52.466 "abort": true, 00:13:52.466 "seek_hole": false, 00:13:52.466 "seek_data": false, 00:13:52.466 "copy": true, 00:13:52.466 "nvme_iov_md": false 00:13:52.466 }, 00:13:52.466 "memory_domains": [ 00:13:52.466 { 00:13:52.466 "dma_device_id": "system", 00:13:52.466 "dma_device_type": 1 00:13:52.466 }, 00:13:52.466 { 00:13:52.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.466 "dma_device_type": 2 00:13:52.466 } 00:13:52.466 ], 00:13:52.466 "driver_specific": {} 00:13:52.466 }' 00:13:52.466 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.466 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.727 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.987 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.987 07:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:53.248 [2024-07-15 07:49:37.991475] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:53.248 [2024-07-15 07:49:37.991493] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:53.248 [2024-07-15 07:49:37.991523] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:53.507 07:49:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.507 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.507 "name": "Existed_Raid", 00:13:53.507 "uuid": "7b1e8412-3731-477e-b470-d058d70f032d", 00:13:53.507 "strip_size_kb": 64, 00:13:53.507 "state": "offline", 00:13:53.507 "raid_level": "concat", 00:13:53.507 "superblock": false, 00:13:53.507 "num_base_bdevs": 3, 00:13:53.507 "num_base_bdevs_discovered": 2, 00:13:53.507 "num_base_bdevs_operational": 2, 00:13:53.507 "base_bdevs_list": [ 00:13:53.507 { 00:13:53.507 "name": null, 00:13:53.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.507 "is_configured": false, 00:13:53.507 "data_offset": 0, 00:13:53.507 "data_size": 65536 00:13:53.507 }, 00:13:53.507 { 00:13:53.507 "name": "BaseBdev2", 00:13:53.508 "uuid": "808ee52b-9281-4da6-9804-d0496eb83eaa", 00:13:53.508 "is_configured": true, 00:13:53.508 "data_offset": 0, 00:13:53.508 "data_size": 65536 00:13:53.508 }, 00:13:53.508 { 00:13:53.508 "name": "BaseBdev3", 00:13:53.508 "uuid": "c22d6b27-42e3-45cd-9187-af309a8d9d2b", 00:13:53.508 "is_configured": true, 00:13:53.508 "data_offset": 0, 00:13:53.508 "data_size": 65536 00:13:53.508 } 00:13:53.508 ] 00:13:53.508 }' 00:13:53.508 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.508 07:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.078 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:54.078 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.078 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.078 07:49:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:54.338 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:54.338 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:54.338 07:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:54.598 [2024-07-15 07:49:39.130379] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:54.598 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:54.858 [2024-07-15 07:49:39.517171] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:54.858 [2024-07-15 07:49:39.517201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1752e90 name Existed_Raid, state offline 00:13:54.858 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.858 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.858 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.858 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:55.118 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:55.118 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:55.118 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:55.118 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:55.118 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.118 07:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:55.689 BaseBdev2 00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
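The rebuild loop that starts here recreates each base bdev and then waits for it to register before re-checking the raid bdev. Stripped of the test harness, that create-and-wait step is just three RPC calls against the test target's socket. A minimal standalone sketch follows; it assumes an SPDK target is already listening on /var/tmp/spdk-raid.sock and that SPDK_DIR points at an SPDK checkout (both are placeholders here, not taken from this run):

#!/usr/bin/env bash
# Sketch of the create-and-wait pattern traced in this test.
# Assumptions: SPDK_DIR is an SPDK checkout and a target already
# serves RPCs on the socket below.
set -e

SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}
RPC="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create a 32 MiB malloc bdev with 512-byte blocks (65536 blocks),
# matching the bdev_malloc_create 32 512 call in the trace.
$RPC bdev_malloc_create 32 512 -b BaseBdev2

# Let any pending examine callbacks finish before querying the bdev.
$RPC bdev_wait_for_examine

# -t waits up to the given number of milliseconds for the bdev to
# appear; a non-zero exit status means it never showed up.
$RPC bdev_get_bdevs -b BaseBdev2 -t 2000
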
00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:55.689 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.259 07:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:56.829 [ 00:13:56.829 { 00:13:56.829 "name": "BaseBdev2", 00:13:56.829 "aliases": [ 00:13:56.830 "4ee22017-db3c-45df-92f1-b889d796fbae" 00:13:56.830 ], 00:13:56.830 "product_name": "Malloc disk", 00:13:56.830 "block_size": 512, 00:13:56.830 "num_blocks": 65536, 00:13:56.830 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:13:56.830 "assigned_rate_limits": { 00:13:56.830 "rw_ios_per_sec": 0, 00:13:56.830 "rw_mbytes_per_sec": 0, 00:13:56.830 "r_mbytes_per_sec": 0, 00:13:56.830 "w_mbytes_per_sec": 0 00:13:56.830 }, 00:13:56.830 "claimed": false, 00:13:56.830 "zoned": false, 00:13:56.830 "supported_io_types": { 00:13:56.830 "read": true, 00:13:56.830 "write": true, 00:13:56.830 "unmap": true, 00:13:56.830 "flush": true, 00:13:56.830 "reset": true, 00:13:56.830 "nvme_admin": false, 00:13:56.830 "nvme_io": false, 00:13:56.830 "nvme_io_md": false, 00:13:56.830 "write_zeroes": true, 00:13:56.830 "zcopy": true, 00:13:56.830 "get_zone_info": false, 00:13:56.830 "zone_management": false, 00:13:56.830 "zone_append": false, 00:13:56.830 "compare": false, 00:13:56.830 "compare_and_write": false, 00:13:56.830 "abort": true, 00:13:56.830 "seek_hole": false, 00:13:56.830 "seek_data": false, 00:13:56.830 "copy": true, 00:13:56.830 "nvme_iov_md": false 00:13:56.830 }, 00:13:56.830 "memory_domains": [ 00:13:56.830 { 00:13:56.830 "dma_device_id": "system", 00:13:56.830 "dma_device_type": 1 00:13:56.830 }, 00:13:56.830 { 00:13:56.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.830 "dma_device_type": 2 00:13:56.830 } 00:13:56.830 ], 00:13:56.830 "driver_specific": {} 00:13:56.830 } 00:13:56.830 ] 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:56.830 BaseBdev3 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:56.830 07:49:41 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.090 07:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:57.659 [ 00:13:57.659 { 00:13:57.659 "name": "BaseBdev3", 00:13:57.659 "aliases": [ 00:13:57.659 "145d4c01-66fc-4efd-b40d-edd3884c866a" 00:13:57.659 ], 00:13:57.659 "product_name": "Malloc disk", 00:13:57.659 "block_size": 512, 00:13:57.659 "num_blocks": 65536, 00:13:57.659 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:13:57.659 "assigned_rate_limits": { 00:13:57.659 "rw_ios_per_sec": 0, 00:13:57.659 "rw_mbytes_per_sec": 0, 00:13:57.659 "r_mbytes_per_sec": 0, 00:13:57.659 "w_mbytes_per_sec": 0 00:13:57.659 }, 00:13:57.659 "claimed": false, 00:13:57.659 "zoned": false, 00:13:57.659 "supported_io_types": { 00:13:57.659 "read": true, 00:13:57.659 "write": true, 00:13:57.659 "unmap": true, 00:13:57.659 "flush": true, 00:13:57.659 "reset": true, 00:13:57.659 "nvme_admin": false, 00:13:57.659 "nvme_io": false, 00:13:57.659 "nvme_io_md": false, 00:13:57.659 "write_zeroes": true, 00:13:57.659 "zcopy": true, 00:13:57.659 "get_zone_info": false, 00:13:57.659 "zone_management": false, 00:13:57.659 "zone_append": false, 00:13:57.659 "compare": false, 00:13:57.659 "compare_and_write": false, 00:13:57.659 "abort": true, 00:13:57.659 "seek_hole": false, 00:13:57.659 "seek_data": false, 00:13:57.659 "copy": true, 00:13:57.659 "nvme_iov_md": false 00:13:57.659 }, 00:13:57.659 "memory_domains": [ 00:13:57.659 { 00:13:57.659 "dma_device_id": "system", 00:13:57.659 "dma_device_type": 1 00:13:57.659 }, 00:13:57.659 { 00:13:57.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.659 "dma_device_type": 2 00:13:57.659 } 00:13:57.659 ], 00:13:57.659 "driver_specific": {} 00:13:57.659 } 00:13:57.659 ] 00:13:57.659 07:49:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:57.659 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:57.659 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:57.659 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:58.230 [2024-07-15 07:49:42.817271] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:58.230 [2024-07-15 07:49:42.817302] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:58.230 [2024-07-15 07:49:42.817315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:58.230 [2024-07-15 07:49:42.818370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.230 07:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.490 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.490 "name": "Existed_Raid", 00:13:58.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.490 "strip_size_kb": 64, 00:13:58.490 "state": "configuring", 00:13:58.490 "raid_level": "concat", 00:13:58.490 "superblock": false, 00:13:58.490 "num_base_bdevs": 3, 00:13:58.490 "num_base_bdevs_discovered": 2, 00:13:58.490 "num_base_bdevs_operational": 3, 00:13:58.490 "base_bdevs_list": [ 00:13:58.490 { 00:13:58.490 "name": "BaseBdev1", 00:13:58.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.490 "is_configured": false, 00:13:58.490 "data_offset": 0, 00:13:58.490 "data_size": 0 00:13:58.490 }, 00:13:58.490 { 00:13:58.490 "name": "BaseBdev2", 00:13:58.490 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:13:58.490 "is_configured": true, 00:13:58.490 "data_offset": 0, 00:13:58.490 "data_size": 65536 00:13:58.490 }, 00:13:58.490 { 00:13:58.490 "name": "BaseBdev3", 00:13:58.490 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:13:58.490 "is_configured": true, 00:13:58.490 "data_offset": 0, 00:13:58.490 "data_size": 65536 00:13:58.490 } 00:13:58.490 ] 00:13:58.490 }' 00:13:58.490 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.490 07:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:59.058 [2024-07-15 07:49:43.755613] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.058 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.317 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.317 "name": "Existed_Raid", 00:13:59.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.317 "strip_size_kb": 64, 00:13:59.317 "state": "configuring", 00:13:59.317 "raid_level": "concat", 00:13:59.317 "superblock": false, 00:13:59.317 "num_base_bdevs": 3, 00:13:59.317 "num_base_bdevs_discovered": 1, 00:13:59.317 "num_base_bdevs_operational": 3, 00:13:59.317 "base_bdevs_list": [ 00:13:59.317 { 00:13:59.317 "name": "BaseBdev1", 00:13:59.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.317 "is_configured": false, 00:13:59.317 "data_offset": 0, 00:13:59.317 "data_size": 0 00:13:59.317 }, 00:13:59.317 { 00:13:59.317 "name": null, 00:13:59.318 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:13:59.318 "is_configured": false, 00:13:59.318 "data_offset": 0, 00:13:59.318 "data_size": 65536 00:13:59.318 }, 00:13:59.318 { 00:13:59.318 "name": "BaseBdev3", 00:13:59.318 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:13:59.318 "is_configured": true, 00:13:59.318 "data_offset": 0, 00:13:59.318 "data_size": 65536 00:13:59.318 } 00:13:59.318 ] 00:13:59.318 }' 00:13:59.318 07:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.318 07:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.888 07:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.888 07:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:00.148 07:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:00.148 07:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:00.148 [2024-07-15 07:49:44.899574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:00.148 BaseBdev1 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:00.410 07:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:00.981 [ 00:14:00.981 { 00:14:00.981 "name": "BaseBdev1", 00:14:00.981 "aliases": [ 00:14:00.981 "738cf5c0-dc66-41ce-aaa6-b1f72208f98c" 00:14:00.981 ], 00:14:00.981 "product_name": "Malloc disk", 00:14:00.981 "block_size": 512, 00:14:00.981 "num_blocks": 65536, 00:14:00.981 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:00.981 "assigned_rate_limits": { 00:14:00.981 "rw_ios_per_sec": 0, 00:14:00.981 "rw_mbytes_per_sec": 0, 00:14:00.981 "r_mbytes_per_sec": 0, 00:14:00.981 "w_mbytes_per_sec": 0 00:14:00.981 }, 00:14:00.981 "claimed": true, 00:14:00.981 "claim_type": "exclusive_write", 00:14:00.981 "zoned": false, 00:14:00.981 "supported_io_types": { 00:14:00.981 "read": true, 00:14:00.981 "write": true, 00:14:00.981 "unmap": true, 00:14:00.981 "flush": true, 00:14:00.981 "reset": true, 00:14:00.981 "nvme_admin": false, 00:14:00.981 "nvme_io": false, 00:14:00.981 "nvme_io_md": false, 00:14:00.981 "write_zeroes": true, 00:14:00.981 "zcopy": true, 00:14:00.981 "get_zone_info": false, 00:14:00.981 "zone_management": false, 00:14:00.981 "zone_append": false, 00:14:00.981 "compare": false, 00:14:00.981 "compare_and_write": false, 00:14:00.981 "abort": true, 00:14:00.981 "seek_hole": false, 00:14:00.981 "seek_data": false, 00:14:00.981 "copy": true, 00:14:00.981 "nvme_iov_md": false 00:14:00.981 }, 00:14:00.981 "memory_domains": [ 00:14:00.981 { 00:14:00.981 "dma_device_id": "system", 00:14:00.981 "dma_device_type": 1 00:14:00.981 }, 00:14:00.981 { 00:14:00.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.981 "dma_device_type": 2 00:14:00.981 } 00:14:00.981 ], 00:14:00.981 "driver_specific": {} 00:14:00.981 } 00:14:00.981 ] 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.981 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.982 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:00.982 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.242 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.242 "name": "Existed_Raid", 00:14:01.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.242 "strip_size_kb": 64, 00:14:01.242 "state": "configuring", 00:14:01.242 "raid_level": "concat", 00:14:01.242 "superblock": false, 00:14:01.242 "num_base_bdevs": 3, 00:14:01.242 "num_base_bdevs_discovered": 2, 00:14:01.242 "num_base_bdevs_operational": 3, 00:14:01.242 "base_bdevs_list": [ 00:14:01.242 { 00:14:01.242 "name": "BaseBdev1", 00:14:01.242 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:01.242 "is_configured": true, 00:14:01.242 "data_offset": 0, 00:14:01.242 "data_size": 65536 00:14:01.242 }, 00:14:01.242 { 00:14:01.242 "name": null, 00:14:01.242 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:01.242 "is_configured": false, 00:14:01.242 "data_offset": 0, 00:14:01.242 "data_size": 65536 00:14:01.242 }, 00:14:01.242 { 00:14:01.242 "name": "BaseBdev3", 00:14:01.242 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:01.242 "is_configured": true, 00:14:01.242 "data_offset": 0, 00:14:01.242 "data_size": 65536 00:14:01.242 } 00:14:01.242 ] 00:14:01.242 }' 00:14:01.242 07:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.242 07:49:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.814 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.814 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:02.075 [2024-07-15 07:49:46.768307] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.075 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.336 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.336 "name": "Existed_Raid", 00:14:02.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.336 "strip_size_kb": 64, 00:14:02.336 "state": "configuring", 00:14:02.336 "raid_level": "concat", 00:14:02.336 "superblock": false, 00:14:02.336 "num_base_bdevs": 3, 00:14:02.336 "num_base_bdevs_discovered": 1, 00:14:02.336 "num_base_bdevs_operational": 3, 00:14:02.336 "base_bdevs_list": [ 00:14:02.336 { 00:14:02.336 "name": "BaseBdev1", 00:14:02.336 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:02.336 "is_configured": true, 00:14:02.336 "data_offset": 0, 00:14:02.336 "data_size": 65536 00:14:02.336 }, 00:14:02.336 { 00:14:02.336 "name": null, 00:14:02.336 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:02.336 "is_configured": false, 00:14:02.336 "data_offset": 0, 00:14:02.336 "data_size": 65536 00:14:02.336 }, 00:14:02.336 { 00:14:02.336 "name": null, 00:14:02.336 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:02.336 "is_configured": false, 00:14:02.336 "data_offset": 0, 00:14:02.336 "data_size": 65536 00:14:02.336 } 00:14:02.336 ] 00:14:02.336 }' 00:14:02.336 07:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.336 07:49:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.907 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.907 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:03.167 [2024-07-15 07:49:47.899184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.167 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.167 07:49:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.168 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.168 07:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.428 07:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.428 "name": "Existed_Raid", 00:14:03.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.428 "strip_size_kb": 64, 00:14:03.428 "state": "configuring", 00:14:03.428 "raid_level": "concat", 00:14:03.428 "superblock": false, 00:14:03.428 "num_base_bdevs": 3, 00:14:03.428 "num_base_bdevs_discovered": 2, 00:14:03.428 "num_base_bdevs_operational": 3, 00:14:03.428 "base_bdevs_list": [ 00:14:03.428 { 00:14:03.428 "name": "BaseBdev1", 00:14:03.428 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:03.428 "is_configured": true, 00:14:03.428 "data_offset": 0, 00:14:03.428 "data_size": 65536 00:14:03.428 }, 00:14:03.428 { 00:14:03.428 "name": null, 00:14:03.428 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:03.428 "is_configured": false, 00:14:03.428 "data_offset": 0, 00:14:03.428 "data_size": 65536 00:14:03.428 }, 00:14:03.428 { 00:14:03.428 "name": "BaseBdev3", 00:14:03.428 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:03.428 "is_configured": true, 00:14:03.428 "data_offset": 0, 00:14:03.428 "data_size": 65536 00:14:03.428 } 00:14:03.428 ] 00:14:03.428 }' 00:14:03.428 07:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.428 07:49:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.999 07:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.999 07:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:04.259 07:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:04.259 07:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:04.520 [2024-07-15 07:49:49.018013] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.520 07:49:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.520 "name": "Existed_Raid", 00:14:04.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.520 "strip_size_kb": 64, 00:14:04.520 "state": "configuring", 00:14:04.520 "raid_level": "concat", 00:14:04.520 "superblock": false, 00:14:04.520 "num_base_bdevs": 3, 00:14:04.520 "num_base_bdevs_discovered": 1, 00:14:04.520 "num_base_bdevs_operational": 3, 00:14:04.520 "base_bdevs_list": [ 00:14:04.520 { 00:14:04.520 "name": null, 00:14:04.520 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:04.520 "is_configured": false, 00:14:04.520 "data_offset": 0, 00:14:04.520 "data_size": 65536 00:14:04.520 }, 00:14:04.520 { 00:14:04.520 "name": null, 00:14:04.520 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:04.520 "is_configured": false, 00:14:04.520 "data_offset": 0, 00:14:04.520 "data_size": 65536 00:14:04.520 }, 00:14:04.520 { 00:14:04.520 "name": "BaseBdev3", 00:14:04.520 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:04.520 "is_configured": true, 00:14:04.520 "data_offset": 0, 00:14:04.520 "data_size": 65536 00:14:04.520 } 00:14:04.520 ] 00:14:04.520 }' 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.520 07:49:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.091 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.091 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:05.351 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:05.351 07:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:05.611 [2024-07-15 07:49:50.118868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.611 "name": "Existed_Raid", 00:14:05.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.611 "strip_size_kb": 64, 00:14:05.611 "state": "configuring", 00:14:05.611 "raid_level": "concat", 00:14:05.611 "superblock": false, 00:14:05.611 "num_base_bdevs": 3, 00:14:05.611 "num_base_bdevs_discovered": 2, 00:14:05.611 "num_base_bdevs_operational": 3, 00:14:05.611 "base_bdevs_list": [ 00:14:05.611 { 00:14:05.611 "name": null, 00:14:05.611 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:05.611 "is_configured": false, 00:14:05.611 "data_offset": 0, 00:14:05.611 "data_size": 65536 00:14:05.611 }, 00:14:05.611 { 00:14:05.611 "name": "BaseBdev2", 00:14:05.611 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:05.611 "is_configured": true, 00:14:05.611 "data_offset": 0, 00:14:05.611 "data_size": 65536 00:14:05.611 }, 00:14:05.611 { 00:14:05.611 "name": "BaseBdev3", 00:14:05.611 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:05.611 "is_configured": true, 00:14:05.611 "data_offset": 0, 00:14:05.611 "data_size": 65536 00:14:05.611 } 00:14:05.611 ] 00:14:05.611 }' 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.611 07:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.182 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.182 07:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:06.441 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:06.441 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.441 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:06.700 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 738cf5c0-dc66-41ce-aaa6-b1f72208f98c 00:14:06.960 [2024-07-15 07:49:51.471137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:06.960 [2024-07-15 07:49:51.471165] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1751250 00:14:06.960 [2024-07-15 07:49:51.471170] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:06.960 [2024-07-15 07:49:51.471321] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17526a0 00:14:06.960 [2024-07-15 07:49:51.471411] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1751250 00:14:06.960 [2024-07-15 07:49:51.471417] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1751250 00:14:06.960 [2024-07-15 07:49:51.471534] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:06.960 NewBaseBdev 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:06.960 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:07.220 [ 00:14:07.220 { 00:14:07.220 "name": "NewBaseBdev", 00:14:07.220 "aliases": [ 00:14:07.220 "738cf5c0-dc66-41ce-aaa6-b1f72208f98c" 00:14:07.220 ], 00:14:07.220 "product_name": "Malloc disk", 00:14:07.220 "block_size": 512, 00:14:07.220 "num_blocks": 65536, 00:14:07.220 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:07.220 "assigned_rate_limits": { 00:14:07.220 "rw_ios_per_sec": 0, 00:14:07.220 "rw_mbytes_per_sec": 0, 00:14:07.220 "r_mbytes_per_sec": 0, 00:14:07.220 "w_mbytes_per_sec": 0 00:14:07.220 }, 00:14:07.220 "claimed": true, 00:14:07.220 "claim_type": "exclusive_write", 00:14:07.220 "zoned": false, 00:14:07.220 "supported_io_types": { 00:14:07.220 "read": true, 00:14:07.220 "write": true, 00:14:07.220 "unmap": true, 00:14:07.220 "flush": true, 00:14:07.220 "reset": true, 00:14:07.220 "nvme_admin": false, 00:14:07.220 "nvme_io": false, 00:14:07.220 "nvme_io_md": false, 00:14:07.220 "write_zeroes": true, 00:14:07.220 "zcopy": true, 00:14:07.220 "get_zone_info": false, 00:14:07.220 "zone_management": false, 00:14:07.220 "zone_append": false, 00:14:07.220 "compare": false, 00:14:07.220 "compare_and_write": false, 00:14:07.220 "abort": true, 00:14:07.220 "seek_hole": false, 00:14:07.220 "seek_data": false, 00:14:07.220 "copy": true, 00:14:07.220 "nvme_iov_md": false 00:14:07.220 }, 00:14:07.220 "memory_domains": [ 00:14:07.220 { 00:14:07.220 "dma_device_id": "system", 00:14:07.220 "dma_device_type": 1 00:14:07.220 }, 00:14:07.220 { 00:14:07.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.220 "dma_device_type": 2 00:14:07.220 } 00:14:07.220 ], 00:14:07.220 "driver_specific": {} 00:14:07.220 } 00:14:07.220 ] 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.220 07:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.481 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.481 "name": "Existed_Raid", 00:14:07.481 "uuid": "a1deda28-23ef-4b68-a842-6ae436b7ec9a", 00:14:07.481 "strip_size_kb": 64, 00:14:07.481 "state": "online", 00:14:07.481 "raid_level": "concat", 00:14:07.481 "superblock": false, 00:14:07.481 "num_base_bdevs": 3, 00:14:07.481 "num_base_bdevs_discovered": 3, 00:14:07.481 "num_base_bdevs_operational": 3, 00:14:07.481 "base_bdevs_list": [ 00:14:07.481 { 00:14:07.481 "name": "NewBaseBdev", 00:14:07.481 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:07.481 "is_configured": true, 00:14:07.481 "data_offset": 0, 00:14:07.481 "data_size": 65536 00:14:07.481 }, 00:14:07.481 { 00:14:07.481 "name": "BaseBdev2", 00:14:07.481 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:07.481 "is_configured": true, 00:14:07.481 "data_offset": 0, 00:14:07.481 "data_size": 65536 00:14:07.481 }, 00:14:07.481 { 00:14:07.481 "name": "BaseBdev3", 00:14:07.481 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:07.481 "is_configured": true, 00:14:07.481 "data_offset": 0, 00:14:07.481 "data_size": 65536 00:14:07.481 } 00:14:07.481 ] 00:14:07.481 }' 00:14:07.481 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.481 07:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:08.050 [2024-07-15 07:49:52.738610] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:08.050 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:08.050 "name": "Existed_Raid", 00:14:08.050 "aliases": [ 00:14:08.050 "a1deda28-23ef-4b68-a842-6ae436b7ec9a" 00:14:08.050 ], 00:14:08.050 "product_name": "Raid Volume", 00:14:08.050 "block_size": 512, 00:14:08.050 "num_blocks": 196608, 00:14:08.050 "uuid": "a1deda28-23ef-4b68-a842-6ae436b7ec9a", 00:14:08.050 "assigned_rate_limits": { 00:14:08.050 "rw_ios_per_sec": 0, 00:14:08.050 "rw_mbytes_per_sec": 0, 00:14:08.050 "r_mbytes_per_sec": 0, 00:14:08.050 "w_mbytes_per_sec": 0 00:14:08.050 }, 00:14:08.050 "claimed": false, 00:14:08.050 "zoned": false, 00:14:08.050 "supported_io_types": { 00:14:08.050 "read": true, 00:14:08.050 "write": true, 00:14:08.051 "unmap": true, 00:14:08.051 "flush": true, 00:14:08.051 "reset": true, 00:14:08.051 "nvme_admin": false, 00:14:08.051 "nvme_io": false, 00:14:08.051 "nvme_io_md": false, 00:14:08.051 "write_zeroes": true, 00:14:08.051 "zcopy": false, 00:14:08.051 "get_zone_info": false, 00:14:08.051 "zone_management": false, 00:14:08.051 "zone_append": false, 00:14:08.051 "compare": false, 00:14:08.051 "compare_and_write": false, 00:14:08.051 "abort": false, 00:14:08.051 "seek_hole": false, 00:14:08.051 "seek_data": false, 00:14:08.051 "copy": false, 00:14:08.051 "nvme_iov_md": false 00:14:08.051 }, 00:14:08.051 "memory_domains": [ 00:14:08.051 { 00:14:08.051 "dma_device_id": "system", 00:14:08.051 "dma_device_type": 1 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.051 "dma_device_type": 2 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "dma_device_id": "system", 00:14:08.051 "dma_device_type": 1 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.051 "dma_device_type": 2 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "dma_device_id": "system", 00:14:08.051 "dma_device_type": 1 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.051 "dma_device_type": 2 00:14:08.051 } 00:14:08.051 ], 00:14:08.051 "driver_specific": { 00:14:08.051 "raid": { 00:14:08.051 "uuid": "a1deda28-23ef-4b68-a842-6ae436b7ec9a", 00:14:08.051 "strip_size_kb": 64, 00:14:08.051 "state": "online", 00:14:08.051 "raid_level": "concat", 00:14:08.051 "superblock": false, 00:14:08.051 "num_base_bdevs": 3, 00:14:08.051 "num_base_bdevs_discovered": 3, 00:14:08.051 "num_base_bdevs_operational": 3, 00:14:08.051 "base_bdevs_list": [ 00:14:08.051 { 00:14:08.051 "name": "NewBaseBdev", 00:14:08.051 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:08.051 "is_configured": true, 00:14:08.051 "data_offset": 0, 00:14:08.051 "data_size": 65536 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "name": "BaseBdev2", 00:14:08.051 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:08.051 "is_configured": true, 00:14:08.051 "data_offset": 0, 00:14:08.051 "data_size": 65536 00:14:08.051 }, 00:14:08.051 { 00:14:08.051 "name": "BaseBdev3", 00:14:08.051 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:08.051 "is_configured": true, 00:14:08.051 "data_offset": 0, 00:14:08.051 "data_size": 65536 00:14:08.051 } 00:14:08.051 ] 00:14:08.051 } 00:14:08.051 } 00:14:08.051 }' 00:14:08.051 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:08.051 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:08.051 BaseBdev2 00:14:08.051 BaseBdev3' 00:14:08.051 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.311 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:08.311 07:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:08.311 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:08.311 "name": "NewBaseBdev", 00:14:08.311 "aliases": [ 00:14:08.311 "738cf5c0-dc66-41ce-aaa6-b1f72208f98c" 00:14:08.311 ], 00:14:08.311 "product_name": "Malloc disk", 00:14:08.311 "block_size": 512, 00:14:08.311 "num_blocks": 65536, 00:14:08.311 "uuid": "738cf5c0-dc66-41ce-aaa6-b1f72208f98c", 00:14:08.311 "assigned_rate_limits": { 00:14:08.311 "rw_ios_per_sec": 0, 00:14:08.312 "rw_mbytes_per_sec": 0, 00:14:08.312 "r_mbytes_per_sec": 0, 00:14:08.312 "w_mbytes_per_sec": 0 00:14:08.312 }, 00:14:08.312 "claimed": true, 00:14:08.312 "claim_type": "exclusive_write", 00:14:08.312 "zoned": false, 00:14:08.312 "supported_io_types": { 00:14:08.312 "read": true, 00:14:08.312 "write": true, 00:14:08.312 "unmap": true, 00:14:08.312 "flush": true, 00:14:08.312 "reset": true, 00:14:08.312 "nvme_admin": false, 00:14:08.312 "nvme_io": false, 00:14:08.312 "nvme_io_md": false, 00:14:08.312 "write_zeroes": true, 00:14:08.312 "zcopy": true, 00:14:08.312 "get_zone_info": false, 00:14:08.312 "zone_management": false, 00:14:08.312 "zone_append": false, 00:14:08.312 "compare": false, 00:14:08.312 "compare_and_write": false, 00:14:08.312 "abort": true, 00:14:08.312 "seek_hole": false, 00:14:08.312 "seek_data": false, 00:14:08.312 "copy": true, 00:14:08.312 "nvme_iov_md": false 00:14:08.312 }, 00:14:08.312 "memory_domains": [ 00:14:08.312 { 00:14:08.312 "dma_device_id": "system", 00:14:08.312 "dma_device_type": 1 00:14:08.312 }, 00:14:08.312 { 00:14:08.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.312 "dma_device_type": 2 00:14:08.312 } 00:14:08.312 ], 00:14:08.312 "driver_specific": {} 00:14:08.312 }' 00:14:08.312 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.312 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.574 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.872 07:49:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.872 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:08.872 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:08.872 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:08.872 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:08.872 "name": "BaseBdev2", 00:14:08.872 "aliases": [ 00:14:08.872 "4ee22017-db3c-45df-92f1-b889d796fbae" 00:14:08.872 ], 00:14:08.872 "product_name": "Malloc disk", 00:14:08.872 "block_size": 512, 00:14:08.872 "num_blocks": 65536, 00:14:08.872 "uuid": "4ee22017-db3c-45df-92f1-b889d796fbae", 00:14:08.872 "assigned_rate_limits": { 00:14:08.872 "rw_ios_per_sec": 0, 00:14:08.872 "rw_mbytes_per_sec": 0, 00:14:08.872 "r_mbytes_per_sec": 0, 00:14:08.872 "w_mbytes_per_sec": 0 00:14:08.872 }, 00:14:08.872 "claimed": true, 00:14:08.872 "claim_type": "exclusive_write", 00:14:08.872 "zoned": false, 00:14:08.872 "supported_io_types": { 00:14:08.872 "read": true, 00:14:08.872 "write": true, 00:14:08.872 "unmap": true, 00:14:08.872 "flush": true, 00:14:08.872 "reset": true, 00:14:08.872 "nvme_admin": false, 00:14:08.872 "nvme_io": false, 00:14:08.872 "nvme_io_md": false, 00:14:08.872 "write_zeroes": true, 00:14:08.872 "zcopy": true, 00:14:08.872 "get_zone_info": false, 00:14:08.872 "zone_management": false, 00:14:08.872 "zone_append": false, 00:14:08.872 "compare": false, 00:14:08.872 "compare_and_write": false, 00:14:08.872 "abort": true, 00:14:08.872 "seek_hole": false, 00:14:08.872 "seek_data": false, 00:14:08.872 "copy": true, 00:14:08.872 "nvme_iov_md": false 00:14:08.872 }, 00:14:08.872 "memory_domains": [ 00:14:08.872 { 00:14:08.872 "dma_device_id": "system", 00:14:08.872 "dma_device_type": 1 00:14:08.872 }, 00:14:08.872 { 00:14:08.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.872 "dma_device_type": 2 00:14:08.872 } 00:14:08.872 ], 00:14:08.872 "driver_specific": {} 00:14:08.872 }' 00:14:08.872 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.133 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.393 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.393 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:14:09.393 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:09.393 07:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:09.393 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:09.393 "name": "BaseBdev3", 00:14:09.393 "aliases": [ 00:14:09.393 "145d4c01-66fc-4efd-b40d-edd3884c866a" 00:14:09.393 ], 00:14:09.393 "product_name": "Malloc disk", 00:14:09.393 "block_size": 512, 00:14:09.393 "num_blocks": 65536, 00:14:09.393 "uuid": "145d4c01-66fc-4efd-b40d-edd3884c866a", 00:14:09.393 "assigned_rate_limits": { 00:14:09.393 "rw_ios_per_sec": 0, 00:14:09.393 "rw_mbytes_per_sec": 0, 00:14:09.393 "r_mbytes_per_sec": 0, 00:14:09.393 "w_mbytes_per_sec": 0 00:14:09.393 }, 00:14:09.393 "claimed": true, 00:14:09.393 "claim_type": "exclusive_write", 00:14:09.393 "zoned": false, 00:14:09.393 "supported_io_types": { 00:14:09.393 "read": true, 00:14:09.393 "write": true, 00:14:09.393 "unmap": true, 00:14:09.393 "flush": true, 00:14:09.393 "reset": true, 00:14:09.393 "nvme_admin": false, 00:14:09.393 "nvme_io": false, 00:14:09.393 "nvme_io_md": false, 00:14:09.393 "write_zeroes": true, 00:14:09.393 "zcopy": true, 00:14:09.393 "get_zone_info": false, 00:14:09.393 "zone_management": false, 00:14:09.393 "zone_append": false, 00:14:09.393 "compare": false, 00:14:09.393 "compare_and_write": false, 00:14:09.393 "abort": true, 00:14:09.393 "seek_hole": false, 00:14:09.393 "seek_data": false, 00:14:09.393 "copy": true, 00:14:09.393 "nvme_iov_md": false 00:14:09.393 }, 00:14:09.393 "memory_domains": [ 00:14:09.393 { 00:14:09.393 "dma_device_id": "system", 00:14:09.393 "dma_device_type": 1 00:14:09.393 }, 00:14:09.393 { 00:14:09.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.393 "dma_device_type": 2 00:14:09.393 } 00:14:09.393 ], 00:14:09.393 "driver_specific": {} 00:14:09.393 }' 00:14:09.393 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:09.652 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.912 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:09.912 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:09.912 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:09.912 [2024-07-15 07:49:54.647191] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:09.912 [2024-07-15 07:49:54.647212] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:09.912 [2024-07-15 07:49:54.647248] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:09.912 [2024-07-15 07:49:54.647283] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:09.912 [2024-07-15 07:49:54.647289] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1751250 name Existed_Raid, state offline 00:14:09.912 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1623469 00:14:09.912 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1623469 ']' 00:14:09.912 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1623469 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1623469 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1623469' 00:14:10.172 killing process with pid 1623469 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1623469 00:14:10.172 [2024-07-15 07:49:54.707140] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1623469 00:14:10.172 [2024-07-15 07:49:54.721683] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:10.172 00:14:10.172 real 0m26.454s 00:14:10.172 user 0m49.654s 00:14:10.172 sys 0m3.768s 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.172 ************************************ 00:14:10.172 END TEST raid_state_function_test 00:14:10.172 ************************************ 00:14:10.172 07:49:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:10.172 07:49:54 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:14:10.172 07:49:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:10.172 07:49:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:10.172 07:49:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:10.172 ************************************ 00:14:10.172 START TEST raid_state_function_test_sb 00:14:10.172 ************************************ 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # 
local raid_level=concat 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:10.172 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:10.444 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1628418 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1628418' 00:14:10.445 Process raid pid: 1628418 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1628418 /var/tmp/spdk-raid.sock 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1628418 ']' 
00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:10.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:10.445 07:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:10.445 [2024-07-15 07:49:54.986287] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:14:10.445 [2024-07-15 07:49:54.986342] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:10.445 [2024-07-15 07:49:55.077926] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:10.445 [2024-07-15 07:49:55.145752] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:10.445 [2024-07-15 07:49:55.187558] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:10.445 [2024-07-15 07:49:55.187573] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:11.384 07:49:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:11.384 07:49:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:11.384 07:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:11.384 [2024-07-15 07:49:56.003349] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:11.384 [2024-07-15 07:49:56.003378] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:11.384 [2024-07-15 07:49:56.003384] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:11.384 [2024-07-15 07:49:56.003391] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:11.384 [2024-07-15 07:49:56.003396] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:11.384 [2024-07-15 07:49:56.003401] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.384 07:49:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.384 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.644 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.644 "name": "Existed_Raid", 00:14:11.644 "uuid": "d61ff60d-ee05-4605-9a81-c02b99118d8f", 00:14:11.644 "strip_size_kb": 64, 00:14:11.644 "state": "configuring", 00:14:11.644 "raid_level": "concat", 00:14:11.644 "superblock": true, 00:14:11.644 "num_base_bdevs": 3, 00:14:11.644 "num_base_bdevs_discovered": 0, 00:14:11.644 "num_base_bdevs_operational": 3, 00:14:11.644 "base_bdevs_list": [ 00:14:11.644 { 00:14:11.644 "name": "BaseBdev1", 00:14:11.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.644 "is_configured": false, 00:14:11.644 "data_offset": 0, 00:14:11.644 "data_size": 0 00:14:11.644 }, 00:14:11.644 { 00:14:11.644 "name": "BaseBdev2", 00:14:11.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.644 "is_configured": false, 00:14:11.644 "data_offset": 0, 00:14:11.644 "data_size": 0 00:14:11.644 }, 00:14:11.644 { 00:14:11.644 "name": "BaseBdev3", 00:14:11.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.644 "is_configured": false, 00:14:11.644 "data_offset": 0, 00:14:11.644 "data_size": 0 00:14:11.644 } 00:14:11.644 ] 00:14:11.644 }' 00:14:11.644 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.644 07:49:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:12.213 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:12.213 [2024-07-15 07:49:56.933584] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:12.213 [2024-07-15 07:49:56.933600] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11af6d0 name Existed_Raid, state configuring 00:14:12.213 07:49:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:12.473 [2024-07-15 07:49:57.130117] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:12.473 [2024-07-15 07:49:57.130138] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:12.473 [2024-07-15 07:49:57.130143] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:12.473 [2024-07-15 07:49:57.130149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:14:12.473 [2024-07-15 07:49:57.130154] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:12.473 [2024-07-15 07:49:57.130160] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:12.473 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:12.734 [2024-07-15 07:49:57.349058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:12.734 BaseBdev1 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:12.734 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:12.995 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:13.256 [ 00:14:13.256 { 00:14:13.256 "name": "BaseBdev1", 00:14:13.256 "aliases": [ 00:14:13.256 "74ed814c-0ced-4f13-8eaf-d64c2f36479c" 00:14:13.256 ], 00:14:13.256 "product_name": "Malloc disk", 00:14:13.256 "block_size": 512, 00:14:13.256 "num_blocks": 65536, 00:14:13.256 "uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:13.256 "assigned_rate_limits": { 00:14:13.256 "rw_ios_per_sec": 0, 00:14:13.256 "rw_mbytes_per_sec": 0, 00:14:13.256 "r_mbytes_per_sec": 0, 00:14:13.256 "w_mbytes_per_sec": 0 00:14:13.256 }, 00:14:13.256 "claimed": true, 00:14:13.256 "claim_type": "exclusive_write", 00:14:13.256 "zoned": false, 00:14:13.256 "supported_io_types": { 00:14:13.256 "read": true, 00:14:13.256 "write": true, 00:14:13.256 "unmap": true, 00:14:13.256 "flush": true, 00:14:13.256 "reset": true, 00:14:13.256 "nvme_admin": false, 00:14:13.256 "nvme_io": false, 00:14:13.256 "nvme_io_md": false, 00:14:13.256 "write_zeroes": true, 00:14:13.256 "zcopy": true, 00:14:13.256 "get_zone_info": false, 00:14:13.256 "zone_management": false, 00:14:13.256 "zone_append": false, 00:14:13.256 "compare": false, 00:14:13.256 "compare_and_write": false, 00:14:13.256 "abort": true, 00:14:13.256 "seek_hole": false, 00:14:13.256 "seek_data": false, 00:14:13.256 "copy": true, 00:14:13.256 "nvme_iov_md": false 00:14:13.256 }, 00:14:13.256 "memory_domains": [ 00:14:13.256 { 00:14:13.256 "dma_device_id": "system", 00:14:13.256 "dma_device_type": 1 00:14:13.256 }, 00:14:13.256 { 00:14:13.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.256 "dma_device_type": 2 00:14:13.256 } 00:14:13.256 ], 00:14:13.256 "driver_specific": {} 00:14:13.256 } 00:14:13.256 ] 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:13.256 
07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.256 "name": "Existed_Raid", 00:14:13.256 "uuid": "7788c2a1-d016-4776-99bb-cfada2af3bc9", 00:14:13.256 "strip_size_kb": 64, 00:14:13.256 "state": "configuring", 00:14:13.256 "raid_level": "concat", 00:14:13.256 "superblock": true, 00:14:13.256 "num_base_bdevs": 3, 00:14:13.256 "num_base_bdevs_discovered": 1, 00:14:13.256 "num_base_bdevs_operational": 3, 00:14:13.256 "base_bdevs_list": [ 00:14:13.256 { 00:14:13.256 "name": "BaseBdev1", 00:14:13.256 "uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:13.256 "is_configured": true, 00:14:13.256 "data_offset": 2048, 00:14:13.256 "data_size": 63488 00:14:13.256 }, 00:14:13.256 { 00:14:13.256 "name": "BaseBdev2", 00:14:13.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.256 "is_configured": false, 00:14:13.256 "data_offset": 0, 00:14:13.256 "data_size": 0 00:14:13.256 }, 00:14:13.256 { 00:14:13.256 "name": "BaseBdev3", 00:14:13.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.256 "is_configured": false, 00:14:13.256 "data_offset": 0, 00:14:13.256 "data_size": 0 00:14:13.256 } 00:14:13.256 ] 00:14:13.256 }' 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.256 07:49:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.827 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:14.088 [2024-07-15 07:49:58.700469] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:14.088 [2024-07-15 07:49:58.700493] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11aefa0 name Existed_Raid, state configuring 00:14:14.088 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:14.349 [2024-07-15 07:49:58.892992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:14.349 [2024-07-15 07:49:58.894117] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:14.349 [2024-07-15 07:49:58.894139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:14.349 [2024-07-15 07:49:58.894145] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:14.349 [2024-07-15 07:49:58.894151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.349 07:49:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:14.610 07:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.610 "name": "Existed_Raid", 00:14:14.610 "uuid": "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728", 00:14:14.610 "strip_size_kb": 64, 00:14:14.610 "state": "configuring", 00:14:14.610 "raid_level": "concat", 00:14:14.610 "superblock": true, 00:14:14.610 "num_base_bdevs": 3, 00:14:14.610 "num_base_bdevs_discovered": 1, 00:14:14.610 "num_base_bdevs_operational": 3, 00:14:14.610 "base_bdevs_list": [ 00:14:14.610 { 00:14:14.610 "name": "BaseBdev1", 00:14:14.610 "uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:14.610 "is_configured": true, 00:14:14.610 "data_offset": 2048, 00:14:14.610 "data_size": 63488 00:14:14.610 }, 00:14:14.610 { 00:14:14.610 "name": "BaseBdev2", 00:14:14.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:14.610 "is_configured": false, 00:14:14.610 "data_offset": 0, 00:14:14.610 "data_size": 0 00:14:14.610 }, 00:14:14.610 { 
00:14:14.610 "name": "BaseBdev3", 00:14:14.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:14.610 "is_configured": false, 00:14:14.610 "data_offset": 0, 00:14:14.610 "data_size": 0 00:14:14.610 } 00:14:14.610 ] 00:14:14.610 }' 00:14:14.610 07:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.610 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:15.181 [2024-07-15 07:49:59.880279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:15.181 BaseBdev2 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:15.181 07:49:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:15.442 07:50:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:15.702 [ 00:14:15.702 { 00:14:15.702 "name": "BaseBdev2", 00:14:15.702 "aliases": [ 00:14:15.702 "a1e78743-1969-4209-9800-f8887ec84d76" 00:14:15.702 ], 00:14:15.702 "product_name": "Malloc disk", 00:14:15.702 "block_size": 512, 00:14:15.702 "num_blocks": 65536, 00:14:15.702 "uuid": "a1e78743-1969-4209-9800-f8887ec84d76", 00:14:15.702 "assigned_rate_limits": { 00:14:15.702 "rw_ios_per_sec": 0, 00:14:15.702 "rw_mbytes_per_sec": 0, 00:14:15.702 "r_mbytes_per_sec": 0, 00:14:15.702 "w_mbytes_per_sec": 0 00:14:15.702 }, 00:14:15.702 "claimed": true, 00:14:15.702 "claim_type": "exclusive_write", 00:14:15.702 "zoned": false, 00:14:15.702 "supported_io_types": { 00:14:15.702 "read": true, 00:14:15.702 "write": true, 00:14:15.702 "unmap": true, 00:14:15.702 "flush": true, 00:14:15.702 "reset": true, 00:14:15.702 "nvme_admin": false, 00:14:15.702 "nvme_io": false, 00:14:15.702 "nvme_io_md": false, 00:14:15.702 "write_zeroes": true, 00:14:15.702 "zcopy": true, 00:14:15.702 "get_zone_info": false, 00:14:15.702 "zone_management": false, 00:14:15.702 "zone_append": false, 00:14:15.702 "compare": false, 00:14:15.702 "compare_and_write": false, 00:14:15.702 "abort": true, 00:14:15.702 "seek_hole": false, 00:14:15.702 "seek_data": false, 00:14:15.702 "copy": true, 00:14:15.702 "nvme_iov_md": false 00:14:15.702 }, 00:14:15.702 "memory_domains": [ 00:14:15.702 { 00:14:15.702 "dma_device_id": "system", 00:14:15.702 "dma_device_type": 1 00:14:15.702 }, 00:14:15.702 { 00:14:15.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.702 "dma_device_type": 2 00:14:15.702 } 00:14:15.702 ], 00:14:15.702 
"driver_specific": {} 00:14:15.702 } 00:14:15.702 ] 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.702 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.962 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.962 "name": "Existed_Raid", 00:14:15.962 "uuid": "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728", 00:14:15.962 "strip_size_kb": 64, 00:14:15.962 "state": "configuring", 00:14:15.962 "raid_level": "concat", 00:14:15.962 "superblock": true, 00:14:15.962 "num_base_bdevs": 3, 00:14:15.962 "num_base_bdevs_discovered": 2, 00:14:15.962 "num_base_bdevs_operational": 3, 00:14:15.962 "base_bdevs_list": [ 00:14:15.962 { 00:14:15.962 "name": "BaseBdev1", 00:14:15.962 "uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:15.962 "is_configured": true, 00:14:15.962 "data_offset": 2048, 00:14:15.962 "data_size": 63488 00:14:15.962 }, 00:14:15.962 { 00:14:15.962 "name": "BaseBdev2", 00:14:15.963 "uuid": "a1e78743-1969-4209-9800-f8887ec84d76", 00:14:15.963 "is_configured": true, 00:14:15.963 "data_offset": 2048, 00:14:15.963 "data_size": 63488 00:14:15.963 }, 00:14:15.963 { 00:14:15.963 "name": "BaseBdev3", 00:14:15.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.963 "is_configured": false, 00:14:15.963 "data_offset": 0, 00:14:15.963 "data_size": 0 00:14:15.963 } 00:14:15.963 ] 00:14:15.963 }' 00:14:15.963 07:50:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.963 07:50:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:14:16.535 [2024-07-15 07:50:01.208497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:16.535 [2024-07-15 07:50:01.208616] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11afe90 00:14:16.535 [2024-07-15 07:50:01.208624] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:16.535 [2024-07-15 07:50:01.208767] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11afb60 00:14:16.535 [2024-07-15 07:50:01.208859] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11afe90 00:14:16.535 [2024-07-15 07:50:01.208864] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11afe90 00:14:16.535 [2024-07-15 07:50:01.208930] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:16.535 BaseBdev3 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:16.535 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:16.795 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:17.055 [ 00:14:17.055 { 00:14:17.055 "name": "BaseBdev3", 00:14:17.055 "aliases": [ 00:14:17.055 "18bfbffb-0945-447e-bbd2-06ad23b736bd" 00:14:17.055 ], 00:14:17.055 "product_name": "Malloc disk", 00:14:17.055 "block_size": 512, 00:14:17.055 "num_blocks": 65536, 00:14:17.055 "uuid": "18bfbffb-0945-447e-bbd2-06ad23b736bd", 00:14:17.055 "assigned_rate_limits": { 00:14:17.055 "rw_ios_per_sec": 0, 00:14:17.055 "rw_mbytes_per_sec": 0, 00:14:17.055 "r_mbytes_per_sec": 0, 00:14:17.055 "w_mbytes_per_sec": 0 00:14:17.055 }, 00:14:17.055 "claimed": true, 00:14:17.055 "claim_type": "exclusive_write", 00:14:17.055 "zoned": false, 00:14:17.055 "supported_io_types": { 00:14:17.055 "read": true, 00:14:17.055 "write": true, 00:14:17.055 "unmap": true, 00:14:17.055 "flush": true, 00:14:17.055 "reset": true, 00:14:17.055 "nvme_admin": false, 00:14:17.055 "nvme_io": false, 00:14:17.055 "nvme_io_md": false, 00:14:17.055 "write_zeroes": true, 00:14:17.055 "zcopy": true, 00:14:17.055 "get_zone_info": false, 00:14:17.055 "zone_management": false, 00:14:17.055 "zone_append": false, 00:14:17.055 "compare": false, 00:14:17.055 "compare_and_write": false, 00:14:17.055 "abort": true, 00:14:17.055 "seek_hole": false, 00:14:17.055 "seek_data": false, 00:14:17.055 "copy": true, 00:14:17.055 "nvme_iov_md": false 00:14:17.055 }, 00:14:17.055 "memory_domains": [ 00:14:17.055 { 00:14:17.055 "dma_device_id": "system", 00:14:17.055 "dma_device_type": 1 00:14:17.055 }, 00:14:17.055 { 00:14:17.055 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:17.055 "dma_device_type": 2 00:14:17.055 } 00:14:17.055 ], 00:14:17.055 "driver_specific": {} 00:14:17.055 } 00:14:17.055 ] 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.055 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.316 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.316 "name": "Existed_Raid", 00:14:17.316 "uuid": "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728", 00:14:17.316 "strip_size_kb": 64, 00:14:17.316 "state": "online", 00:14:17.316 "raid_level": "concat", 00:14:17.316 "superblock": true, 00:14:17.316 "num_base_bdevs": 3, 00:14:17.316 "num_base_bdevs_discovered": 3, 00:14:17.316 "num_base_bdevs_operational": 3, 00:14:17.316 "base_bdevs_list": [ 00:14:17.316 { 00:14:17.316 "name": "BaseBdev1", 00:14:17.316 "uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:17.316 "is_configured": true, 00:14:17.316 "data_offset": 2048, 00:14:17.316 "data_size": 63488 00:14:17.316 }, 00:14:17.316 { 00:14:17.316 "name": "BaseBdev2", 00:14:17.316 "uuid": "a1e78743-1969-4209-9800-f8887ec84d76", 00:14:17.316 "is_configured": true, 00:14:17.316 "data_offset": 2048, 00:14:17.316 "data_size": 63488 00:14:17.316 }, 00:14:17.316 { 00:14:17.316 "name": "BaseBdev3", 00:14:17.316 "uuid": "18bfbffb-0945-447e-bbd2-06ad23b736bd", 00:14:17.316 "is_configured": true, 00:14:17.316 "data_offset": 2048, 00:14:17.316 "data_size": 63488 00:14:17.316 } 00:14:17.316 ] 00:14:17.316 }' 00:14:17.316 07:50:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.316 07:50:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:17.887 [2024-07-15 07:50:02.576193] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:17.887 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:17.887 "name": "Existed_Raid", 00:14:17.887 "aliases": [ 00:14:17.887 "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728" 00:14:17.887 ], 00:14:17.887 "product_name": "Raid Volume", 00:14:17.887 "block_size": 512, 00:14:17.887 "num_blocks": 190464, 00:14:17.887 "uuid": "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728", 00:14:17.887 "assigned_rate_limits": { 00:14:17.887 "rw_ios_per_sec": 0, 00:14:17.887 "rw_mbytes_per_sec": 0, 00:14:17.887 "r_mbytes_per_sec": 0, 00:14:17.887 "w_mbytes_per_sec": 0 00:14:17.887 }, 00:14:17.887 "claimed": false, 00:14:17.887 "zoned": false, 00:14:17.887 "supported_io_types": { 00:14:17.887 "read": true, 00:14:17.887 "write": true, 00:14:17.887 "unmap": true, 00:14:17.887 "flush": true, 00:14:17.887 "reset": true, 00:14:17.887 "nvme_admin": false, 00:14:17.887 "nvme_io": false, 00:14:17.887 "nvme_io_md": false, 00:14:17.887 "write_zeroes": true, 00:14:17.887 "zcopy": false, 00:14:17.887 "get_zone_info": false, 00:14:17.887 "zone_management": false, 00:14:17.887 "zone_append": false, 00:14:17.887 "compare": false, 00:14:17.887 "compare_and_write": false, 00:14:17.887 "abort": false, 00:14:17.887 "seek_hole": false, 00:14:17.887 "seek_data": false, 00:14:17.887 "copy": false, 00:14:17.887 "nvme_iov_md": false 00:14:17.887 }, 00:14:17.887 "memory_domains": [ 00:14:17.887 { 00:14:17.887 "dma_device_id": "system", 00:14:17.887 "dma_device_type": 1 00:14:17.887 }, 00:14:17.887 { 00:14:17.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.887 "dma_device_type": 2 00:14:17.887 }, 00:14:17.887 { 00:14:17.887 "dma_device_id": "system", 00:14:17.887 "dma_device_type": 1 00:14:17.887 }, 00:14:17.887 { 00:14:17.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.887 "dma_device_type": 2 00:14:17.887 }, 00:14:17.887 { 00:14:17.887 "dma_device_id": "system", 00:14:17.887 "dma_device_type": 1 00:14:17.887 }, 00:14:17.887 { 00:14:17.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.887 "dma_device_type": 2 00:14:17.887 } 00:14:17.887 ], 00:14:17.887 "driver_specific": { 00:14:17.887 "raid": { 00:14:17.887 "uuid": "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728", 00:14:17.887 "strip_size_kb": 64, 00:14:17.887 "state": "online", 00:14:17.887 "raid_level": "concat", 00:14:17.887 "superblock": true, 00:14:17.887 "num_base_bdevs": 3, 00:14:17.887 "num_base_bdevs_discovered": 3, 00:14:17.887 "num_base_bdevs_operational": 3, 00:14:17.887 "base_bdevs_list": [ 00:14:17.887 { 00:14:17.887 "name": "BaseBdev1", 00:14:17.888 
"uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:17.888 "is_configured": true, 00:14:17.888 "data_offset": 2048, 00:14:17.888 "data_size": 63488 00:14:17.888 }, 00:14:17.888 { 00:14:17.888 "name": "BaseBdev2", 00:14:17.888 "uuid": "a1e78743-1969-4209-9800-f8887ec84d76", 00:14:17.888 "is_configured": true, 00:14:17.888 "data_offset": 2048, 00:14:17.888 "data_size": 63488 00:14:17.888 }, 00:14:17.888 { 00:14:17.888 "name": "BaseBdev3", 00:14:17.888 "uuid": "18bfbffb-0945-447e-bbd2-06ad23b736bd", 00:14:17.888 "is_configured": true, 00:14:17.888 "data_offset": 2048, 00:14:17.888 "data_size": 63488 00:14:17.888 } 00:14:17.888 ] 00:14:17.888 } 00:14:17.888 } 00:14:17.888 }' 00:14:17.888 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:18.148 BaseBdev2 00:14:18.148 BaseBdev3' 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.148 "name": "BaseBdev1", 00:14:18.148 "aliases": [ 00:14:18.148 "74ed814c-0ced-4f13-8eaf-d64c2f36479c" 00:14:18.148 ], 00:14:18.148 "product_name": "Malloc disk", 00:14:18.148 "block_size": 512, 00:14:18.148 "num_blocks": 65536, 00:14:18.148 "uuid": "74ed814c-0ced-4f13-8eaf-d64c2f36479c", 00:14:18.148 "assigned_rate_limits": { 00:14:18.148 "rw_ios_per_sec": 0, 00:14:18.148 "rw_mbytes_per_sec": 0, 00:14:18.148 "r_mbytes_per_sec": 0, 00:14:18.148 "w_mbytes_per_sec": 0 00:14:18.148 }, 00:14:18.148 "claimed": true, 00:14:18.148 "claim_type": "exclusive_write", 00:14:18.148 "zoned": false, 00:14:18.148 "supported_io_types": { 00:14:18.148 "read": true, 00:14:18.148 "write": true, 00:14:18.148 "unmap": true, 00:14:18.148 "flush": true, 00:14:18.148 "reset": true, 00:14:18.148 "nvme_admin": false, 00:14:18.148 "nvme_io": false, 00:14:18.148 "nvme_io_md": false, 00:14:18.148 "write_zeroes": true, 00:14:18.148 "zcopy": true, 00:14:18.148 "get_zone_info": false, 00:14:18.148 "zone_management": false, 00:14:18.148 "zone_append": false, 00:14:18.148 "compare": false, 00:14:18.148 "compare_and_write": false, 00:14:18.148 "abort": true, 00:14:18.148 "seek_hole": false, 00:14:18.148 "seek_data": false, 00:14:18.148 "copy": true, 00:14:18.148 "nvme_iov_md": false 00:14:18.148 }, 00:14:18.148 "memory_domains": [ 00:14:18.148 { 00:14:18.148 "dma_device_id": "system", 00:14:18.148 "dma_device_type": 1 00:14:18.148 }, 00:14:18.148 { 00:14:18.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.148 "dma_device_type": 2 00:14:18.148 } 00:14:18.148 ], 00:14:18.148 "driver_specific": {} 00:14:18.148 }' 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.148 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.408 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.408 07:50:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.408 07:50:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.408 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.408 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.408 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.408 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.408 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.408 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.685 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.685 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.685 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.685 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:18.685 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.685 "name": "BaseBdev2", 00:14:18.685 "aliases": [ 00:14:18.685 "a1e78743-1969-4209-9800-f8887ec84d76" 00:14:18.685 ], 00:14:18.685 "product_name": "Malloc disk", 00:14:18.685 "block_size": 512, 00:14:18.685 "num_blocks": 65536, 00:14:18.685 "uuid": "a1e78743-1969-4209-9800-f8887ec84d76", 00:14:18.685 "assigned_rate_limits": { 00:14:18.685 "rw_ios_per_sec": 0, 00:14:18.685 "rw_mbytes_per_sec": 0, 00:14:18.685 "r_mbytes_per_sec": 0, 00:14:18.685 "w_mbytes_per_sec": 0 00:14:18.685 }, 00:14:18.685 "claimed": true, 00:14:18.685 "claim_type": "exclusive_write", 00:14:18.685 "zoned": false, 00:14:18.685 "supported_io_types": { 00:14:18.685 "read": true, 00:14:18.685 "write": true, 00:14:18.686 "unmap": true, 00:14:18.686 "flush": true, 00:14:18.686 "reset": true, 00:14:18.686 "nvme_admin": false, 00:14:18.686 "nvme_io": false, 00:14:18.686 "nvme_io_md": false, 00:14:18.686 "write_zeroes": true, 00:14:18.686 "zcopy": true, 00:14:18.686 "get_zone_info": false, 00:14:18.686 "zone_management": false, 00:14:18.686 "zone_append": false, 00:14:18.686 "compare": false, 00:14:18.686 "compare_and_write": false, 00:14:18.686 "abort": true, 00:14:18.686 "seek_hole": false, 00:14:18.686 "seek_data": false, 00:14:18.686 "copy": true, 00:14:18.686 "nvme_iov_md": false 00:14:18.686 }, 00:14:18.686 "memory_domains": [ 00:14:18.686 { 00:14:18.686 "dma_device_id": "system", 00:14:18.686 "dma_device_type": 1 00:14:18.686 }, 00:14:18.686 { 00:14:18.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.686 "dma_device_type": 2 00:14:18.686 } 00:14:18.686 ], 00:14:18.686 "driver_specific": {} 00:14:18.686 }' 00:14:18.686 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.686 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:18.946 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.206 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.206 "name": "BaseBdev3", 00:14:19.206 "aliases": [ 00:14:19.206 "18bfbffb-0945-447e-bbd2-06ad23b736bd" 00:14:19.206 ], 00:14:19.206 "product_name": "Malloc disk", 00:14:19.206 "block_size": 512, 00:14:19.206 "num_blocks": 65536, 00:14:19.206 "uuid": "18bfbffb-0945-447e-bbd2-06ad23b736bd", 00:14:19.206 "assigned_rate_limits": { 00:14:19.206 "rw_ios_per_sec": 0, 00:14:19.206 "rw_mbytes_per_sec": 0, 00:14:19.206 "r_mbytes_per_sec": 0, 00:14:19.206 "w_mbytes_per_sec": 0 00:14:19.206 }, 00:14:19.206 "claimed": true, 00:14:19.206 "claim_type": "exclusive_write", 00:14:19.206 "zoned": false, 00:14:19.206 "supported_io_types": { 00:14:19.206 "read": true, 00:14:19.206 "write": true, 00:14:19.206 "unmap": true, 00:14:19.206 "flush": true, 00:14:19.206 "reset": true, 00:14:19.206 "nvme_admin": false, 00:14:19.206 "nvme_io": false, 00:14:19.206 "nvme_io_md": false, 00:14:19.206 "write_zeroes": true, 00:14:19.206 "zcopy": true, 00:14:19.206 "get_zone_info": false, 00:14:19.206 "zone_management": false, 00:14:19.206 "zone_append": false, 00:14:19.206 "compare": false, 00:14:19.206 "compare_and_write": false, 00:14:19.206 "abort": true, 00:14:19.206 "seek_hole": false, 00:14:19.206 "seek_data": false, 00:14:19.206 "copy": true, 00:14:19.206 "nvme_iov_md": false 00:14:19.206 }, 00:14:19.206 "memory_domains": [ 00:14:19.206 { 00:14:19.206 "dma_device_id": "system", 00:14:19.206 "dma_device_type": 1 00:14:19.206 }, 00:14:19.206 { 00:14:19.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.206 "dma_device_type": 2 00:14:19.206 } 00:14:19.206 ], 00:14:19.206 "driver_specific": {} 00:14:19.206 }' 00:14:19.206 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.206 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.466 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.466 07:50:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:19.466 
07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.466 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:19.726 [2024-07-15 07:50:04.392583] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:19.726 [2024-07-15 07:50:04.392600] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:19.726 [2024-07-15 07:50:04.392628] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.726 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.987 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.987 "name": "Existed_Raid", 00:14:19.987 "uuid": "2f7de1fa-4dfd-4115-8b53-6ac61a8dd728", 00:14:19.987 "strip_size_kb": 64, 00:14:19.987 "state": "offline", 00:14:19.987 "raid_level": 
"concat", 00:14:19.987 "superblock": true, 00:14:19.987 "num_base_bdevs": 3, 00:14:19.987 "num_base_bdevs_discovered": 2, 00:14:19.987 "num_base_bdevs_operational": 2, 00:14:19.987 "base_bdevs_list": [ 00:14:19.987 { 00:14:19.987 "name": null, 00:14:19.987 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.987 "is_configured": false, 00:14:19.987 "data_offset": 2048, 00:14:19.987 "data_size": 63488 00:14:19.987 }, 00:14:19.987 { 00:14:19.987 "name": "BaseBdev2", 00:14:19.987 "uuid": "a1e78743-1969-4209-9800-f8887ec84d76", 00:14:19.987 "is_configured": true, 00:14:19.987 "data_offset": 2048, 00:14:19.987 "data_size": 63488 00:14:19.987 }, 00:14:19.987 { 00:14:19.987 "name": "BaseBdev3", 00:14:19.987 "uuid": "18bfbffb-0945-447e-bbd2-06ad23b736bd", 00:14:19.987 "is_configured": true, 00:14:19.987 "data_offset": 2048, 00:14:19.987 "data_size": 63488 00:14:19.987 } 00:14:19.987 ] 00:14:19.987 }' 00:14:19.987 07:50:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.987 07:50:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:20.556 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:20.556 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:20.556 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.556 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:20.816 [2024-07-15 07:50:05.511425] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.816 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:21.075 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:21.075 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:21.075 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:21.336 [2024-07-15 07:50:05.894182] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:21.336 [2024-07-15 07:50:05.894213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11afe90 name Existed_Raid, state offline 00:14:21.336 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:14:21.336 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:21.336 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.336 07:50:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:21.596 BaseBdev2 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:21.596 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:21.856 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:22.117 [ 00:14:22.117 { 00:14:22.117 "name": "BaseBdev2", 00:14:22.117 "aliases": [ 00:14:22.117 "cf09b7ef-4f76-4de1-98a5-7a5d008614b0" 00:14:22.117 ], 00:14:22.117 "product_name": "Malloc disk", 00:14:22.117 "block_size": 512, 00:14:22.117 "num_blocks": 65536, 00:14:22.117 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:22.117 "assigned_rate_limits": { 00:14:22.117 "rw_ios_per_sec": 0, 00:14:22.117 "rw_mbytes_per_sec": 0, 00:14:22.117 "r_mbytes_per_sec": 0, 00:14:22.117 "w_mbytes_per_sec": 0 00:14:22.117 }, 00:14:22.117 "claimed": false, 00:14:22.117 "zoned": false, 00:14:22.117 "supported_io_types": { 00:14:22.117 "read": true, 00:14:22.117 "write": true, 00:14:22.117 "unmap": true, 00:14:22.117 "flush": true, 00:14:22.117 "reset": true, 00:14:22.117 "nvme_admin": false, 00:14:22.117 "nvme_io": false, 00:14:22.117 "nvme_io_md": false, 00:14:22.117 "write_zeroes": true, 00:14:22.117 "zcopy": true, 00:14:22.117 "get_zone_info": false, 00:14:22.117 "zone_management": false, 00:14:22.117 "zone_append": false, 00:14:22.117 "compare": false, 00:14:22.117 "compare_and_write": false, 00:14:22.117 "abort": true, 00:14:22.117 "seek_hole": false, 00:14:22.117 "seek_data": false, 00:14:22.117 "copy": 
true, 00:14:22.117 "nvme_iov_md": false 00:14:22.117 }, 00:14:22.117 "memory_domains": [ 00:14:22.117 { 00:14:22.117 "dma_device_id": "system", 00:14:22.117 "dma_device_type": 1 00:14:22.117 }, 00:14:22.117 { 00:14:22.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.117 "dma_device_type": 2 00:14:22.117 } 00:14:22.117 ], 00:14:22.117 "driver_specific": {} 00:14:22.117 } 00:14:22.117 ] 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:22.117 BaseBdev3 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:22.117 07:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:22.378 07:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:22.639 [ 00:14:22.639 { 00:14:22.639 "name": "BaseBdev3", 00:14:22.639 "aliases": [ 00:14:22.639 "1c84c62b-6044-4b72-ab6e-593ae374a90c" 00:14:22.639 ], 00:14:22.639 "product_name": "Malloc disk", 00:14:22.639 "block_size": 512, 00:14:22.639 "num_blocks": 65536, 00:14:22.639 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:22.639 "assigned_rate_limits": { 00:14:22.639 "rw_ios_per_sec": 0, 00:14:22.639 "rw_mbytes_per_sec": 0, 00:14:22.639 "r_mbytes_per_sec": 0, 00:14:22.639 "w_mbytes_per_sec": 0 00:14:22.639 }, 00:14:22.639 "claimed": false, 00:14:22.639 "zoned": false, 00:14:22.639 "supported_io_types": { 00:14:22.639 "read": true, 00:14:22.639 "write": true, 00:14:22.639 "unmap": true, 00:14:22.639 "flush": true, 00:14:22.639 "reset": true, 00:14:22.639 "nvme_admin": false, 00:14:22.639 "nvme_io": false, 00:14:22.639 "nvme_io_md": false, 00:14:22.639 "write_zeroes": true, 00:14:22.639 "zcopy": true, 00:14:22.639 "get_zone_info": false, 00:14:22.639 "zone_management": false, 00:14:22.639 "zone_append": false, 00:14:22.639 "compare": false, 00:14:22.639 "compare_and_write": false, 00:14:22.639 "abort": true, 00:14:22.639 "seek_hole": false, 00:14:22.639 "seek_data": false, 00:14:22.639 "copy": true, 00:14:22.639 "nvme_iov_md": false 00:14:22.639 }, 00:14:22.639 "memory_domains": [ 00:14:22.639 { 00:14:22.639 "dma_device_id": "system", 00:14:22.639 "dma_device_type": 1 00:14:22.639 }, 00:14:22.639 { 00:14:22.639 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:22.639 "dma_device_type": 2 00:14:22.639 } 00:14:22.639 ], 00:14:22.639 "driver_specific": {} 00:14:22.639 } 00:14:22.639 ] 00:14:22.639 07:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:22.639 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:22.639 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:22.639 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:22.639 [2024-07-15 07:50:07.393840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:22.639 [2024-07-15 07:50:07.393867] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:22.639 [2024-07-15 07:50:07.393879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:22.900 [2024-07-15 07:50:07.394902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.900 "name": "Existed_Raid", 00:14:22.900 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:22.900 "strip_size_kb": 64, 00:14:22.900 "state": "configuring", 00:14:22.900 "raid_level": "concat", 00:14:22.900 "superblock": true, 00:14:22.900 "num_base_bdevs": 3, 00:14:22.900 "num_base_bdevs_discovered": 2, 00:14:22.900 "num_base_bdevs_operational": 3, 00:14:22.900 "base_bdevs_list": [ 00:14:22.900 { 00:14:22.900 "name": "BaseBdev1", 00:14:22.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.900 "is_configured": false, 00:14:22.900 "data_offset": 0, 00:14:22.900 "data_size": 0 00:14:22.900 }, 00:14:22.900 { 00:14:22.900 "name": 
"BaseBdev2", 00:14:22.900 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:22.900 "is_configured": true, 00:14:22.900 "data_offset": 2048, 00:14:22.900 "data_size": 63488 00:14:22.900 }, 00:14:22.900 { 00:14:22.900 "name": "BaseBdev3", 00:14:22.900 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:22.900 "is_configured": true, 00:14:22.900 "data_offset": 2048, 00:14:22.900 "data_size": 63488 00:14:22.900 } 00:14:22.900 ] 00:14:22.900 }' 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.900 07:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.471 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:23.733 [2024-07-15 07:50:08.328192] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.733 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.994 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.994 "name": "Existed_Raid", 00:14:23.994 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:23.994 "strip_size_kb": 64, 00:14:23.994 "state": "configuring", 00:14:23.994 "raid_level": "concat", 00:14:23.994 "superblock": true, 00:14:23.994 "num_base_bdevs": 3, 00:14:23.994 "num_base_bdevs_discovered": 1, 00:14:23.994 "num_base_bdevs_operational": 3, 00:14:23.994 "base_bdevs_list": [ 00:14:23.994 { 00:14:23.994 "name": "BaseBdev1", 00:14:23.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.994 "is_configured": false, 00:14:23.994 "data_offset": 0, 00:14:23.994 "data_size": 0 00:14:23.994 }, 00:14:23.994 { 00:14:23.994 "name": null, 00:14:23.994 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:23.994 "is_configured": false, 00:14:23.994 "data_offset": 2048, 00:14:23.994 "data_size": 63488 00:14:23.994 }, 00:14:23.994 { 00:14:23.994 "name": "BaseBdev3", 00:14:23.994 "uuid": 
"1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:23.994 "is_configured": true, 00:14:23.994 "data_offset": 2048, 00:14:23.994 "data_size": 63488 00:14:23.994 } 00:14:23.994 ] 00:14:23.994 }' 00:14:23.994 07:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.994 07:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:24.567 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.567 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:24.567 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:24.567 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:24.848 [2024-07-15 07:50:09.448013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:24.848 BaseBdev1 00:14:24.848 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:24.849 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:24.849 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:24.849 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:24.849 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:24.849 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:24.849 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:25.109 [ 00:14:25.109 { 00:14:25.109 "name": "BaseBdev1", 00:14:25.109 "aliases": [ 00:14:25.109 "de00c889-5b1b-4443-94a2-24a618160306" 00:14:25.109 ], 00:14:25.109 "product_name": "Malloc disk", 00:14:25.109 "block_size": 512, 00:14:25.109 "num_blocks": 65536, 00:14:25.109 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:25.109 "assigned_rate_limits": { 00:14:25.109 "rw_ios_per_sec": 0, 00:14:25.109 "rw_mbytes_per_sec": 0, 00:14:25.109 "r_mbytes_per_sec": 0, 00:14:25.109 "w_mbytes_per_sec": 0 00:14:25.109 }, 00:14:25.109 "claimed": true, 00:14:25.109 "claim_type": "exclusive_write", 00:14:25.109 "zoned": false, 00:14:25.109 "supported_io_types": { 00:14:25.109 "read": true, 00:14:25.109 "write": true, 00:14:25.109 "unmap": true, 00:14:25.109 "flush": true, 00:14:25.109 "reset": true, 00:14:25.109 "nvme_admin": false, 00:14:25.109 "nvme_io": false, 00:14:25.109 "nvme_io_md": false, 00:14:25.109 "write_zeroes": true, 00:14:25.109 "zcopy": true, 00:14:25.109 "get_zone_info": false, 00:14:25.109 "zone_management": false, 00:14:25.109 "zone_append": false, 00:14:25.109 "compare": false, 00:14:25.109 "compare_and_write": false, 00:14:25.109 "abort": true, 00:14:25.109 "seek_hole": false, 
00:14:25.109 "seek_data": false, 00:14:25.109 "copy": true, 00:14:25.109 "nvme_iov_md": false 00:14:25.109 }, 00:14:25.109 "memory_domains": [ 00:14:25.109 { 00:14:25.109 "dma_device_id": "system", 00:14:25.109 "dma_device_type": 1 00:14:25.109 }, 00:14:25.109 { 00:14:25.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.109 "dma_device_type": 2 00:14:25.109 } 00:14:25.109 ], 00:14:25.109 "driver_specific": {} 00:14:25.109 } 00:14:25.109 ] 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.109 07:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.401 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.401 "name": "Existed_Raid", 00:14:25.401 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:25.401 "strip_size_kb": 64, 00:14:25.401 "state": "configuring", 00:14:25.401 "raid_level": "concat", 00:14:25.401 "superblock": true, 00:14:25.401 "num_base_bdevs": 3, 00:14:25.401 "num_base_bdevs_discovered": 2, 00:14:25.401 "num_base_bdevs_operational": 3, 00:14:25.401 "base_bdevs_list": [ 00:14:25.401 { 00:14:25.401 "name": "BaseBdev1", 00:14:25.401 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:25.401 "is_configured": true, 00:14:25.401 "data_offset": 2048, 00:14:25.401 "data_size": 63488 00:14:25.401 }, 00:14:25.401 { 00:14:25.401 "name": null, 00:14:25.401 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:25.401 "is_configured": false, 00:14:25.401 "data_offset": 2048, 00:14:25.401 "data_size": 63488 00:14:25.401 }, 00:14:25.401 { 00:14:25.401 "name": "BaseBdev3", 00:14:25.401 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:25.401 "is_configured": true, 00:14:25.401 "data_offset": 2048, 00:14:25.401 "data_size": 63488 00:14:25.401 } 00:14:25.401 ] 00:14:25.401 }' 00:14:25.401 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.401 07:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:25.978 07:50:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.978 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:26.239 [2024-07-15 07:50:10.951832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.239 07:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:26.499 07:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.499 "name": "Existed_Raid", 00:14:26.499 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:26.499 "strip_size_kb": 64, 00:14:26.499 "state": "configuring", 00:14:26.499 "raid_level": "concat", 00:14:26.499 "superblock": true, 00:14:26.499 "num_base_bdevs": 3, 00:14:26.499 "num_base_bdevs_discovered": 1, 00:14:26.499 "num_base_bdevs_operational": 3, 00:14:26.499 "base_bdevs_list": [ 00:14:26.499 { 00:14:26.499 "name": "BaseBdev1", 00:14:26.499 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:26.499 "is_configured": true, 00:14:26.499 "data_offset": 2048, 00:14:26.499 "data_size": 63488 00:14:26.499 }, 00:14:26.499 { 00:14:26.499 "name": null, 00:14:26.499 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:26.499 "is_configured": false, 00:14:26.499 "data_offset": 2048, 00:14:26.499 "data_size": 63488 00:14:26.499 }, 00:14:26.499 { 00:14:26.499 "name": null, 00:14:26.499 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:26.499 "is_configured": false, 00:14:26.499 "data_offset": 2048, 00:14:26.499 "data_size": 63488 00:14:26.499 } 00:14:26.499 ] 00:14:26.499 }' 00:14:26.499 07:50:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.499 07:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:27.070 07:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.070 07:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:27.331 07:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:27.331 07:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:27.331 [2024-07-15 07:50:12.062656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.331 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.591 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.591 "name": "Existed_Raid", 00:14:27.591 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:27.591 "strip_size_kb": 64, 00:14:27.591 "state": "configuring", 00:14:27.591 "raid_level": "concat", 00:14:27.591 "superblock": true, 00:14:27.591 "num_base_bdevs": 3, 00:14:27.591 "num_base_bdevs_discovered": 2, 00:14:27.591 "num_base_bdevs_operational": 3, 00:14:27.591 "base_bdevs_list": [ 00:14:27.591 { 00:14:27.591 "name": "BaseBdev1", 00:14:27.591 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:27.591 "is_configured": true, 00:14:27.591 "data_offset": 2048, 00:14:27.591 "data_size": 63488 00:14:27.591 }, 00:14:27.591 { 00:14:27.591 "name": null, 00:14:27.591 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:27.591 "is_configured": false, 00:14:27.591 "data_offset": 2048, 00:14:27.591 "data_size": 63488 00:14:27.591 }, 00:14:27.591 { 00:14:27.591 "name": "BaseBdev3", 00:14:27.591 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 
00:14:27.591 "is_configured": true, 00:14:27.591 "data_offset": 2048, 00:14:27.591 "data_size": 63488 00:14:27.591 } 00:14:27.591 ] 00:14:27.591 }' 00:14:27.591 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.591 07:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:28.161 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.161 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:28.421 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:28.421 07:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:28.421 [2024-07-15 07:50:13.117348] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.421 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.681 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.681 "name": "Existed_Raid", 00:14:28.681 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:28.681 "strip_size_kb": 64, 00:14:28.681 "state": "configuring", 00:14:28.681 "raid_level": "concat", 00:14:28.681 "superblock": true, 00:14:28.681 "num_base_bdevs": 3, 00:14:28.681 "num_base_bdevs_discovered": 1, 00:14:28.681 "num_base_bdevs_operational": 3, 00:14:28.681 "base_bdevs_list": [ 00:14:28.681 { 00:14:28.681 "name": null, 00:14:28.681 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:28.681 "is_configured": false, 00:14:28.681 "data_offset": 2048, 00:14:28.681 "data_size": 63488 00:14:28.681 }, 00:14:28.681 { 00:14:28.681 "name": null, 00:14:28.681 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:28.681 "is_configured": false, 00:14:28.681 "data_offset": 2048, 
00:14:28.681 "data_size": 63488 00:14:28.681 }, 00:14:28.681 { 00:14:28.681 "name": "BaseBdev3", 00:14:28.681 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:28.681 "is_configured": true, 00:14:28.681 "data_offset": 2048, 00:14:28.681 "data_size": 63488 00:14:28.681 } 00:14:28.681 ] 00:14:28.681 }' 00:14:28.681 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.681 07:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:29.250 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.250 07:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:29.509 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:29.510 [2024-07-15 07:50:14.233985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.510 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:29.770 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.770 "name": "Existed_Raid", 00:14:29.770 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:29.770 "strip_size_kb": 64, 00:14:29.770 "state": "configuring", 00:14:29.770 "raid_level": "concat", 00:14:29.770 "superblock": true, 00:14:29.770 "num_base_bdevs": 3, 00:14:29.770 "num_base_bdevs_discovered": 2, 00:14:29.770 "num_base_bdevs_operational": 3, 00:14:29.770 "base_bdevs_list": [ 00:14:29.770 { 00:14:29.770 "name": null, 00:14:29.770 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:29.770 "is_configured": false, 00:14:29.770 "data_offset": 2048, 00:14:29.770 
"data_size": 63488 00:14:29.770 }, 00:14:29.770 { 00:14:29.770 "name": "BaseBdev2", 00:14:29.770 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:29.770 "is_configured": true, 00:14:29.770 "data_offset": 2048, 00:14:29.770 "data_size": 63488 00:14:29.770 }, 00:14:29.770 { 00:14:29.770 "name": "BaseBdev3", 00:14:29.770 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:29.770 "is_configured": true, 00:14:29.770 "data_offset": 2048, 00:14:29.770 "data_size": 63488 00:14:29.770 } 00:14:29.770 ] 00:14:29.770 }' 00:14:29.770 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.770 07:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:30.339 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.339 07:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:30.600 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:30.600 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.600 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u de00c889-5b1b-4443-94a2-24a618160306 00:14:30.862 [2024-07-15 07:50:15.570245] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:30.862 [2024-07-15 07:50:15.570356] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13536f0 00:14:30.862 [2024-07-15 07:50:15.570363] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:30.862 [2024-07-15 07:50:15.570500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11b0440 00:14:30.862 [2024-07-15 07:50:15.570586] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13536f0 00:14:30.862 [2024-07-15 07:50:15.570591] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13536f0 00:14:30.862 [2024-07-15 07:50:15.570658] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.862 NewBaseBdev 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:30.862 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.122 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:31.382 [ 00:14:31.382 { 00:14:31.382 "name": "NewBaseBdev", 00:14:31.382 "aliases": [ 00:14:31.382 "de00c889-5b1b-4443-94a2-24a618160306" 00:14:31.382 ], 00:14:31.382 "product_name": "Malloc disk", 00:14:31.382 "block_size": 512, 00:14:31.382 "num_blocks": 65536, 00:14:31.382 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:31.382 "assigned_rate_limits": { 00:14:31.382 "rw_ios_per_sec": 0, 00:14:31.382 "rw_mbytes_per_sec": 0, 00:14:31.382 "r_mbytes_per_sec": 0, 00:14:31.382 "w_mbytes_per_sec": 0 00:14:31.382 }, 00:14:31.382 "claimed": true, 00:14:31.382 "claim_type": "exclusive_write", 00:14:31.382 "zoned": false, 00:14:31.382 "supported_io_types": { 00:14:31.382 "read": true, 00:14:31.382 "write": true, 00:14:31.382 "unmap": true, 00:14:31.382 "flush": true, 00:14:31.382 "reset": true, 00:14:31.382 "nvme_admin": false, 00:14:31.382 "nvme_io": false, 00:14:31.382 "nvme_io_md": false, 00:14:31.382 "write_zeroes": true, 00:14:31.382 "zcopy": true, 00:14:31.382 "get_zone_info": false, 00:14:31.382 "zone_management": false, 00:14:31.382 "zone_append": false, 00:14:31.382 "compare": false, 00:14:31.382 "compare_and_write": false, 00:14:31.382 "abort": true, 00:14:31.383 "seek_hole": false, 00:14:31.383 "seek_data": false, 00:14:31.383 "copy": true, 00:14:31.383 "nvme_iov_md": false 00:14:31.383 }, 00:14:31.383 "memory_domains": [ 00:14:31.383 { 00:14:31.383 "dma_device_id": "system", 00:14:31.383 "dma_device_type": 1 00:14:31.383 }, 00:14:31.383 { 00:14:31.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.383 "dma_device_type": 2 00:14:31.383 } 00:14:31.383 ], 00:14:31.383 "driver_specific": {} 00:14:31.383 } 00:14:31.383 ] 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.383 07:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:31.644 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.644 "name": "Existed_Raid", 00:14:31.644 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:31.644 "strip_size_kb": 64, 00:14:31.644 "state": "online", 00:14:31.644 "raid_level": "concat", 00:14:31.644 "superblock": true, 00:14:31.644 "num_base_bdevs": 3, 00:14:31.644 "num_base_bdevs_discovered": 3, 00:14:31.644 "num_base_bdevs_operational": 3, 00:14:31.644 "base_bdevs_list": [ 00:14:31.644 { 00:14:31.644 "name": "NewBaseBdev", 00:14:31.644 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:31.644 "is_configured": true, 00:14:31.644 "data_offset": 2048, 00:14:31.644 "data_size": 63488 00:14:31.644 }, 00:14:31.644 { 00:14:31.644 "name": "BaseBdev2", 00:14:31.644 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:31.644 "is_configured": true, 00:14:31.644 "data_offset": 2048, 00:14:31.644 "data_size": 63488 00:14:31.644 }, 00:14:31.644 { 00:14:31.644 "name": "BaseBdev3", 00:14:31.644 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:31.644 "is_configured": true, 00:14:31.644 "data_offset": 2048, 00:14:31.644 "data_size": 63488 00:14:31.644 } 00:14:31.644 ] 00:14:31.644 }' 00:14:31.644 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.644 07:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:32.216 [2024-07-15 07:50:16.885811] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:32.216 "name": "Existed_Raid", 00:14:32.216 "aliases": [ 00:14:32.216 "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f" 00:14:32.216 ], 00:14:32.216 "product_name": "Raid Volume", 00:14:32.216 "block_size": 512, 00:14:32.216 "num_blocks": 190464, 00:14:32.216 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:32.216 "assigned_rate_limits": { 00:14:32.216 "rw_ios_per_sec": 0, 00:14:32.216 "rw_mbytes_per_sec": 0, 00:14:32.216 "r_mbytes_per_sec": 0, 00:14:32.216 "w_mbytes_per_sec": 0 00:14:32.216 }, 00:14:32.216 "claimed": false, 00:14:32.216 "zoned": false, 00:14:32.216 "supported_io_types": { 00:14:32.216 "read": true, 00:14:32.216 "write": true, 00:14:32.216 "unmap": true, 00:14:32.216 "flush": true, 00:14:32.216 "reset": true, 00:14:32.216 "nvme_admin": false, 00:14:32.216 "nvme_io": false, 00:14:32.216 "nvme_io_md": false, 00:14:32.216 "write_zeroes": true, 00:14:32.216 
"zcopy": false, 00:14:32.216 "get_zone_info": false, 00:14:32.216 "zone_management": false, 00:14:32.216 "zone_append": false, 00:14:32.216 "compare": false, 00:14:32.216 "compare_and_write": false, 00:14:32.216 "abort": false, 00:14:32.216 "seek_hole": false, 00:14:32.216 "seek_data": false, 00:14:32.216 "copy": false, 00:14:32.216 "nvme_iov_md": false 00:14:32.216 }, 00:14:32.216 "memory_domains": [ 00:14:32.216 { 00:14:32.216 "dma_device_id": "system", 00:14:32.216 "dma_device_type": 1 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.216 "dma_device_type": 2 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "dma_device_id": "system", 00:14:32.216 "dma_device_type": 1 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.216 "dma_device_type": 2 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "dma_device_id": "system", 00:14:32.216 "dma_device_type": 1 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.216 "dma_device_type": 2 00:14:32.216 } 00:14:32.216 ], 00:14:32.216 "driver_specific": { 00:14:32.216 "raid": { 00:14:32.216 "uuid": "fc39c21e-b285-4d50-bd84-a4a5cd7d5b4f", 00:14:32.216 "strip_size_kb": 64, 00:14:32.216 "state": "online", 00:14:32.216 "raid_level": "concat", 00:14:32.216 "superblock": true, 00:14:32.216 "num_base_bdevs": 3, 00:14:32.216 "num_base_bdevs_discovered": 3, 00:14:32.216 "num_base_bdevs_operational": 3, 00:14:32.216 "base_bdevs_list": [ 00:14:32.216 { 00:14:32.216 "name": "NewBaseBdev", 00:14:32.216 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:32.216 "is_configured": true, 00:14:32.216 "data_offset": 2048, 00:14:32.216 "data_size": 63488 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "name": "BaseBdev2", 00:14:32.216 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:32.216 "is_configured": true, 00:14:32.216 "data_offset": 2048, 00:14:32.216 "data_size": 63488 00:14:32.216 }, 00:14:32.216 { 00:14:32.216 "name": "BaseBdev3", 00:14:32.216 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:32.216 "is_configured": true, 00:14:32.216 "data_offset": 2048, 00:14:32.216 "data_size": 63488 00:14:32.216 } 00:14:32.216 ] 00:14:32.216 } 00:14:32.216 } 00:14:32.216 }' 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:32.216 BaseBdev2 00:14:32.216 BaseBdev3' 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:32.216 07:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.477 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:32.477 "name": "NewBaseBdev", 00:14:32.477 "aliases": [ 00:14:32.477 "de00c889-5b1b-4443-94a2-24a618160306" 00:14:32.477 ], 00:14:32.477 "product_name": "Malloc disk", 00:14:32.477 "block_size": 512, 00:14:32.477 "num_blocks": 65536, 00:14:32.477 "uuid": "de00c889-5b1b-4443-94a2-24a618160306", 00:14:32.477 "assigned_rate_limits": { 00:14:32.477 "rw_ios_per_sec": 0, 00:14:32.477 "rw_mbytes_per_sec": 0, 
00:14:32.477 "r_mbytes_per_sec": 0, 00:14:32.477 "w_mbytes_per_sec": 0 00:14:32.477 }, 00:14:32.477 "claimed": true, 00:14:32.477 "claim_type": "exclusive_write", 00:14:32.477 "zoned": false, 00:14:32.477 "supported_io_types": { 00:14:32.477 "read": true, 00:14:32.477 "write": true, 00:14:32.477 "unmap": true, 00:14:32.477 "flush": true, 00:14:32.477 "reset": true, 00:14:32.477 "nvme_admin": false, 00:14:32.477 "nvme_io": false, 00:14:32.477 "nvme_io_md": false, 00:14:32.477 "write_zeroes": true, 00:14:32.477 "zcopy": true, 00:14:32.477 "get_zone_info": false, 00:14:32.477 "zone_management": false, 00:14:32.477 "zone_append": false, 00:14:32.477 "compare": false, 00:14:32.477 "compare_and_write": false, 00:14:32.477 "abort": true, 00:14:32.477 "seek_hole": false, 00:14:32.477 "seek_data": false, 00:14:32.477 "copy": true, 00:14:32.477 "nvme_iov_md": false 00:14:32.477 }, 00:14:32.477 "memory_domains": [ 00:14:32.477 { 00:14:32.478 "dma_device_id": "system", 00:14:32.478 "dma_device_type": 1 00:14:32.478 }, 00:14:32.478 { 00:14:32.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.478 "dma_device_type": 2 00:14:32.478 } 00:14:32.478 ], 00:14:32.478 "driver_specific": {} 00:14:32.478 }' 00:14:32.478 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.478 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.738 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:32.998 "name": "BaseBdev2", 00:14:32.998 "aliases": [ 00:14:32.998 "cf09b7ef-4f76-4de1-98a5-7a5d008614b0" 00:14:32.998 ], 00:14:32.998 "product_name": "Malloc disk", 00:14:32.998 "block_size": 512, 00:14:32.998 "num_blocks": 65536, 00:14:32.998 "uuid": "cf09b7ef-4f76-4de1-98a5-7a5d008614b0", 00:14:32.998 "assigned_rate_limits": { 00:14:32.998 "rw_ios_per_sec": 0, 00:14:32.998 "rw_mbytes_per_sec": 0, 00:14:32.998 "r_mbytes_per_sec": 0, 00:14:32.998 "w_mbytes_per_sec": 0 00:14:32.998 }, 00:14:32.998 "claimed": true, 00:14:32.998 
"claim_type": "exclusive_write", 00:14:32.998 "zoned": false, 00:14:32.998 "supported_io_types": { 00:14:32.998 "read": true, 00:14:32.998 "write": true, 00:14:32.998 "unmap": true, 00:14:32.998 "flush": true, 00:14:32.998 "reset": true, 00:14:32.998 "nvme_admin": false, 00:14:32.998 "nvme_io": false, 00:14:32.998 "nvme_io_md": false, 00:14:32.998 "write_zeroes": true, 00:14:32.998 "zcopy": true, 00:14:32.998 "get_zone_info": false, 00:14:32.998 "zone_management": false, 00:14:32.998 "zone_append": false, 00:14:32.998 "compare": false, 00:14:32.998 "compare_and_write": false, 00:14:32.998 "abort": true, 00:14:32.998 "seek_hole": false, 00:14:32.998 "seek_data": false, 00:14:32.998 "copy": true, 00:14:32.998 "nvme_iov_md": false 00:14:32.998 }, 00:14:32.998 "memory_domains": [ 00:14:32.998 { 00:14:32.998 "dma_device_id": "system", 00:14:32.998 "dma_device_type": 1 00:14:32.998 }, 00:14:32.998 { 00:14:32.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.998 "dma_device_type": 2 00:14:32.998 } 00:14:32.998 ], 00:14:32.998 "driver_specific": {} 00:14:32.998 }' 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.998 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.258 07:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.258 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.258 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:33.258 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:33.518 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:33.518 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:33.518 "name": "BaseBdev3", 00:14:33.518 "aliases": [ 00:14:33.518 "1c84c62b-6044-4b72-ab6e-593ae374a90c" 00:14:33.518 ], 00:14:33.518 "product_name": "Malloc disk", 00:14:33.518 "block_size": 512, 00:14:33.518 "num_blocks": 65536, 00:14:33.518 "uuid": "1c84c62b-6044-4b72-ab6e-593ae374a90c", 00:14:33.518 "assigned_rate_limits": { 00:14:33.518 "rw_ios_per_sec": 0, 00:14:33.518 "rw_mbytes_per_sec": 0, 00:14:33.518 "r_mbytes_per_sec": 0, 00:14:33.518 "w_mbytes_per_sec": 0 00:14:33.518 }, 00:14:33.518 "claimed": true, 00:14:33.518 "claim_type": "exclusive_write", 00:14:33.518 "zoned": false, 00:14:33.518 "supported_io_types": { 00:14:33.518 "read": true, 
00:14:33.518 "write": true, 00:14:33.518 "unmap": true, 00:14:33.518 "flush": true, 00:14:33.518 "reset": true, 00:14:33.518 "nvme_admin": false, 00:14:33.518 "nvme_io": false, 00:14:33.518 "nvme_io_md": false, 00:14:33.518 "write_zeroes": true, 00:14:33.518 "zcopy": true, 00:14:33.518 "get_zone_info": false, 00:14:33.518 "zone_management": false, 00:14:33.518 "zone_append": false, 00:14:33.518 "compare": false, 00:14:33.518 "compare_and_write": false, 00:14:33.518 "abort": true, 00:14:33.518 "seek_hole": false, 00:14:33.518 "seek_data": false, 00:14:33.518 "copy": true, 00:14:33.518 "nvme_iov_md": false 00:14:33.518 }, 00:14:33.518 "memory_domains": [ 00:14:33.518 { 00:14:33.518 "dma_device_id": "system", 00:14:33.518 "dma_device_type": 1 00:14:33.518 }, 00:14:33.518 { 00:14:33.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.518 "dma_device_type": 2 00:14:33.518 } 00:14:33.518 ], 00:14:33.518 "driver_specific": {} 00:14:33.518 }' 00:14:33.518 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.518 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.779 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:34.039 [2024-07-15 07:50:18.686147] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:34.039 [2024-07-15 07:50:18.686164] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:34.039 [2024-07-15 07:50:18.686198] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:34.039 [2024-07-15 07:50:18.686232] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:34.039 [2024-07-15 07:50:18.686238] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13536f0 name Existed_Raid, state offline 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1628418 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1628418 ']' 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1628418 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
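Note on the trace above: it exercises a full base-bdev replacement on the concat array before tearing it down. BaseBdev1 is removed with bdev_malloc_delete (the array stays in "configuring" while a slot is unclaimed), BaseBdev2 is re-attached with bdev_raid_add_base_bdev, and a new malloc bdev is created under the UUID recorded for the remaining empty slot so the raid module claims it automatically; the array is then verified "online" and deleted. A rough manual reproduction of that sequence, assuming an already-running bdev_svc with the same Existed_Raid array on the same RPC socket (the UUID below is the one taken from the log), would be:

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # drop one member; the array keeps a placeholder slot and stays "configuring"
  $RPC bdev_malloc_delete BaseBdev1
  # re-attach an existing malloc bdev to the array
  $RPC bdev_raid_add_base_bdev Existed_Raid BaseBdev2
  # recreate the missing member (32 MB, 512-byte blocks) under the slot's recorded UUID
  # so it is claimed into the array automatically
  $RPC bdev_malloc_create 32 512 -b NewBaseBdev -u de00c889-5b1b-4443-94a2-24a618160306
  # confirm the array reports state "online" with 3 of 3 base bdevs, then remove it
  $RPC bdev_raid_get_bdevs all
  $RPC bdev_raid_delete Existed_Raid

This is a sketch of the steps the test script drives, not a replacement for running the test itself.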
00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1628418 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1628418' 00:14:34.039 killing process with pid 1628418 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1628418 00:14:34.039 [2024-07-15 07:50:18.753552] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:34.039 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1628418 00:14:34.039 [2024-07-15 07:50:18.768257] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:34.300 07:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:34.300 00:14:34.300 real 0m23.962s 00:14:34.300 user 0m45.053s 00:14:34.300 sys 0m3.442s 00:14:34.300 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:34.300 07:50:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:34.300 ************************************ 00:14:34.300 END TEST raid_state_function_test_sb 00:14:34.300 ************************************ 00:14:34.300 07:50:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:34.300 07:50:18 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:14:34.300 07:50:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:34.300 07:50:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:34.300 07:50:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:34.300 ************************************ 00:14:34.300 START TEST raid_superblock_test 00:14:34.300 ************************************ 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local 
strip_size_create_arg 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1633098 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1633098 /var/tmp/spdk-raid.sock 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1633098 ']' 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:34.300 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:34.300 07:50:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.300 [2024-07-15 07:50:19.033626] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
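Note: the entries that follow build the stack raid_superblock_test operates on: three 32 MiB malloc bdevs, each wrapped in a passthru vbdev (pt1..pt3) with a fixed UUID, and a superblock-enabled concat array on top of them. Reproduced by hand against the bdev_svc instance started above, the construction is roughly as below (flag meanings inferred from the trace: -z strip size in KiB, -r RAID level, -b base bdevs, -n array name, -s request an on-disk superblock, matching the "superblock": true seen in the later dumps):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  for i in 1 2 3; do
      # 32 MB backing device with 512-byte blocks
      $RPC bdev_malloc_create 32 512 -b malloc$i
      # passthru wrapper with a fixed, predictable UUID
      $RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  # superblock-enabled concat array across the three passthru bdevs, 64 KiB strip size
  $RPC bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s
  $RPC bdev_raid_get_bdevs all

These are the same RPCs the test invokes below; the loop is only a compact restatement of them.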
00:14:34.300 [2024-07-15 07:50:19.033683] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1633098 ] 00:14:34.561 [2024-07-15 07:50:19.124858] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:34.561 [2024-07-15 07:50:19.202267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:34.561 [2024-07-15 07:50:19.255467] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.561 [2024-07-15 07:50:19.255494] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:35.133 07:50:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:35.417 malloc1 00:14:35.417 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:35.677 [2024-07-15 07:50:20.222821] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:35.677 [2024-07-15 07:50:20.222858] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.677 [2024-07-15 07:50:20.222870] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c0a20 00:14:35.677 [2024-07-15 07:50:20.222876] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.677 [2024-07-15 07:50:20.224297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.677 [2024-07-15 07:50:20.224318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:35.677 pt1 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:35.677 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:35.937 malloc2 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:35.937 [2024-07-15 07:50:20.609769] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:35.937 [2024-07-15 07:50:20.609798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.937 [2024-07-15 07:50:20.609808] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c1040 00:14:35.937 [2024-07-15 07:50:20.609815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.937 [2024-07-15 07:50:20.610992] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.937 [2024-07-15 07:50:20.611011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:35.937 pt2 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:35.937 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:35.938 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:36.198 malloc3 00:14:36.198 07:50:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:36.458 [2024-07-15 07:50:20.992744] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:36.458 [2024-07-15 07:50:20.992771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:36.458 [2024-07-15 07:50:20.992780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c1540 00:14:36.458 [2024-07-15 07:50:20.992787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:36.458 [2024-07-15 07:50:20.993973] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:36.458 [2024-07-15 07:50:20.993993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:36.458 pt3 00:14:36.458 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:36.458 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:36.458 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:36.459 [2024-07-15 07:50:21.169203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:36.459 [2024-07-15 07:50:21.170208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:36.459 [2024-07-15 07:50:21.170248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:36.459 [2024-07-15 07:50:21.170363] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x136da90 00:14:36.459 [2024-07-15 07:50:21.170370] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:36.459 [2024-07-15 07:50:21.170525] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1369c50 00:14:36.459 [2024-07-15 07:50:21.170632] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x136da90 00:14:36.459 [2024-07-15 07:50:21.170638] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x136da90 00:14:36.459 [2024-07-15 07:50:21.170706] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.459 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:36.719 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.719 "name": "raid_bdev1", 00:14:36.719 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:36.719 "strip_size_kb": 64, 00:14:36.719 "state": "online", 00:14:36.719 "raid_level": "concat", 00:14:36.719 "superblock": true, 00:14:36.719 "num_base_bdevs": 3, 
00:14:36.719 "num_base_bdevs_discovered": 3, 00:14:36.719 "num_base_bdevs_operational": 3, 00:14:36.719 "base_bdevs_list": [ 00:14:36.719 { 00:14:36.719 "name": "pt1", 00:14:36.719 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:36.719 "is_configured": true, 00:14:36.719 "data_offset": 2048, 00:14:36.719 "data_size": 63488 00:14:36.719 }, 00:14:36.719 { 00:14:36.719 "name": "pt2", 00:14:36.719 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:36.719 "is_configured": true, 00:14:36.719 "data_offset": 2048, 00:14:36.719 "data_size": 63488 00:14:36.719 }, 00:14:36.719 { 00:14:36.719 "name": "pt3", 00:14:36.719 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:36.719 "is_configured": true, 00:14:36.719 "data_offset": 2048, 00:14:36.719 "data_size": 63488 00:14:36.719 } 00:14:36.719 ] 00:14:36.719 }' 00:14:36.719 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.719 07:50:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.290 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:37.291 07:50:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:37.551 [2024-07-15 07:50:22.075689] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:37.551 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:37.551 "name": "raid_bdev1", 00:14:37.551 "aliases": [ 00:14:37.551 "ff995da8-4cf3-4d31-b09a-a7d04163c53c" 00:14:37.551 ], 00:14:37.551 "product_name": "Raid Volume", 00:14:37.551 "block_size": 512, 00:14:37.551 "num_blocks": 190464, 00:14:37.551 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:37.551 "assigned_rate_limits": { 00:14:37.551 "rw_ios_per_sec": 0, 00:14:37.551 "rw_mbytes_per_sec": 0, 00:14:37.551 "r_mbytes_per_sec": 0, 00:14:37.551 "w_mbytes_per_sec": 0 00:14:37.551 }, 00:14:37.551 "claimed": false, 00:14:37.551 "zoned": false, 00:14:37.551 "supported_io_types": { 00:14:37.551 "read": true, 00:14:37.551 "write": true, 00:14:37.551 "unmap": true, 00:14:37.551 "flush": true, 00:14:37.551 "reset": true, 00:14:37.552 "nvme_admin": false, 00:14:37.552 "nvme_io": false, 00:14:37.552 "nvme_io_md": false, 00:14:37.552 "write_zeroes": true, 00:14:37.552 "zcopy": false, 00:14:37.552 "get_zone_info": false, 00:14:37.552 "zone_management": false, 00:14:37.552 "zone_append": false, 00:14:37.552 "compare": false, 00:14:37.552 "compare_and_write": false, 00:14:37.552 "abort": false, 00:14:37.552 "seek_hole": false, 00:14:37.552 "seek_data": false, 00:14:37.552 "copy": false, 00:14:37.552 "nvme_iov_md": false 00:14:37.552 }, 00:14:37.552 "memory_domains": [ 00:14:37.552 { 00:14:37.552 "dma_device_id": "system", 00:14:37.552 "dma_device_type": 1 
00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.552 "dma_device_type": 2 00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "dma_device_id": "system", 00:14:37.552 "dma_device_type": 1 00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.552 "dma_device_type": 2 00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "dma_device_id": "system", 00:14:37.552 "dma_device_type": 1 00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.552 "dma_device_type": 2 00:14:37.552 } 00:14:37.552 ], 00:14:37.552 "driver_specific": { 00:14:37.552 "raid": { 00:14:37.552 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:37.552 "strip_size_kb": 64, 00:14:37.552 "state": "online", 00:14:37.552 "raid_level": "concat", 00:14:37.552 "superblock": true, 00:14:37.552 "num_base_bdevs": 3, 00:14:37.552 "num_base_bdevs_discovered": 3, 00:14:37.552 "num_base_bdevs_operational": 3, 00:14:37.552 "base_bdevs_list": [ 00:14:37.552 { 00:14:37.552 "name": "pt1", 00:14:37.552 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:37.552 "is_configured": true, 00:14:37.552 "data_offset": 2048, 00:14:37.552 "data_size": 63488 00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "name": "pt2", 00:14:37.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:37.552 "is_configured": true, 00:14:37.552 "data_offset": 2048, 00:14:37.552 "data_size": 63488 00:14:37.552 }, 00:14:37.552 { 00:14:37.552 "name": "pt3", 00:14:37.552 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:37.552 "is_configured": true, 00:14:37.552 "data_offset": 2048, 00:14:37.552 "data_size": 63488 00:14:37.552 } 00:14:37.552 ] 00:14:37.552 } 00:14:37.552 } 00:14:37.552 }' 00:14:37.552 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:37.552 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:37.552 pt2 00:14:37.552 pt3' 00:14:37.552 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:37.552 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:37.552 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:37.813 "name": "pt1", 00:14:37.813 "aliases": [ 00:14:37.813 "00000000-0000-0000-0000-000000000001" 00:14:37.813 ], 00:14:37.813 "product_name": "passthru", 00:14:37.813 "block_size": 512, 00:14:37.813 "num_blocks": 65536, 00:14:37.813 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:37.813 "assigned_rate_limits": { 00:14:37.813 "rw_ios_per_sec": 0, 00:14:37.813 "rw_mbytes_per_sec": 0, 00:14:37.813 "r_mbytes_per_sec": 0, 00:14:37.813 "w_mbytes_per_sec": 0 00:14:37.813 }, 00:14:37.813 "claimed": true, 00:14:37.813 "claim_type": "exclusive_write", 00:14:37.813 "zoned": false, 00:14:37.813 "supported_io_types": { 00:14:37.813 "read": true, 00:14:37.813 "write": true, 00:14:37.813 "unmap": true, 00:14:37.813 "flush": true, 00:14:37.813 "reset": true, 00:14:37.813 "nvme_admin": false, 00:14:37.813 "nvme_io": false, 00:14:37.813 "nvme_io_md": false, 00:14:37.813 "write_zeroes": true, 00:14:37.813 "zcopy": true, 00:14:37.813 "get_zone_info": false, 00:14:37.813 "zone_management": false, 
00:14:37.813 "zone_append": false, 00:14:37.813 "compare": false, 00:14:37.813 "compare_and_write": false, 00:14:37.813 "abort": true, 00:14:37.813 "seek_hole": false, 00:14:37.813 "seek_data": false, 00:14:37.813 "copy": true, 00:14:37.813 "nvme_iov_md": false 00:14:37.813 }, 00:14:37.813 "memory_domains": [ 00:14:37.813 { 00:14:37.813 "dma_device_id": "system", 00:14:37.813 "dma_device_type": 1 00:14:37.813 }, 00:14:37.813 { 00:14:37.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.813 "dma_device_type": 2 00:14:37.813 } 00:14:37.813 ], 00:14:37.813 "driver_specific": { 00:14:37.813 "passthru": { 00:14:37.813 "name": "pt1", 00:14:37.813 "base_bdev_name": "malloc1" 00:14:37.813 } 00:14:37.813 } 00:14:37.813 }' 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:37.813 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:38.074 "name": "pt2", 00:14:38.074 "aliases": [ 00:14:38.074 "00000000-0000-0000-0000-000000000002" 00:14:38.074 ], 00:14:38.074 "product_name": "passthru", 00:14:38.074 "block_size": 512, 00:14:38.074 "num_blocks": 65536, 00:14:38.074 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:38.074 "assigned_rate_limits": { 00:14:38.074 "rw_ios_per_sec": 0, 00:14:38.074 "rw_mbytes_per_sec": 0, 00:14:38.074 "r_mbytes_per_sec": 0, 00:14:38.074 "w_mbytes_per_sec": 0 00:14:38.074 }, 00:14:38.074 "claimed": true, 00:14:38.074 "claim_type": "exclusive_write", 00:14:38.074 "zoned": false, 00:14:38.074 "supported_io_types": { 00:14:38.074 "read": true, 00:14:38.074 "write": true, 00:14:38.074 "unmap": true, 00:14:38.074 "flush": true, 00:14:38.074 "reset": true, 00:14:38.074 "nvme_admin": false, 00:14:38.074 "nvme_io": false, 00:14:38.074 "nvme_io_md": false, 00:14:38.074 "write_zeroes": true, 00:14:38.074 "zcopy": true, 00:14:38.074 "get_zone_info": false, 00:14:38.074 "zone_management": false, 00:14:38.074 "zone_append": false, 00:14:38.074 "compare": false, 00:14:38.074 "compare_and_write": false, 00:14:38.074 "abort": true, 
00:14:38.074 "seek_hole": false, 00:14:38.074 "seek_data": false, 00:14:38.074 "copy": true, 00:14:38.074 "nvme_iov_md": false 00:14:38.074 }, 00:14:38.074 "memory_domains": [ 00:14:38.074 { 00:14:38.074 "dma_device_id": "system", 00:14:38.074 "dma_device_type": 1 00:14:38.074 }, 00:14:38.074 { 00:14:38.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.074 "dma_device_type": 2 00:14:38.074 } 00:14:38.074 ], 00:14:38.074 "driver_specific": { 00:14:38.074 "passthru": { 00:14:38.074 "name": "pt2", 00:14:38.074 "base_bdev_name": "malloc2" 00:14:38.074 } 00:14:38.074 } 00:14:38.074 }' 00:14:38.074 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.334 07:50:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.334 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:38.334 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.334 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:38.607 "name": "pt3", 00:14:38.607 "aliases": [ 00:14:38.607 "00000000-0000-0000-0000-000000000003" 00:14:38.607 ], 00:14:38.607 "product_name": "passthru", 00:14:38.607 "block_size": 512, 00:14:38.607 "num_blocks": 65536, 00:14:38.607 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:38.607 "assigned_rate_limits": { 00:14:38.607 "rw_ios_per_sec": 0, 00:14:38.607 "rw_mbytes_per_sec": 0, 00:14:38.607 "r_mbytes_per_sec": 0, 00:14:38.607 "w_mbytes_per_sec": 0 00:14:38.607 }, 00:14:38.607 "claimed": true, 00:14:38.607 "claim_type": "exclusive_write", 00:14:38.607 "zoned": false, 00:14:38.607 "supported_io_types": { 00:14:38.607 "read": true, 00:14:38.607 "write": true, 00:14:38.607 "unmap": true, 00:14:38.607 "flush": true, 00:14:38.607 "reset": true, 00:14:38.607 "nvme_admin": false, 00:14:38.607 "nvme_io": false, 00:14:38.607 "nvme_io_md": false, 00:14:38.607 "write_zeroes": true, 00:14:38.607 "zcopy": true, 00:14:38.607 "get_zone_info": false, 00:14:38.607 "zone_management": false, 00:14:38.607 "zone_append": false, 00:14:38.607 "compare": false, 00:14:38.607 "compare_and_write": false, 00:14:38.607 "abort": true, 00:14:38.607 "seek_hole": false, 00:14:38.607 "seek_data": false, 00:14:38.607 "copy": true, 00:14:38.607 "nvme_iov_md": false 
00:14:38.607 }, 00:14:38.607 "memory_domains": [ 00:14:38.607 { 00:14:38.607 "dma_device_id": "system", 00:14:38.607 "dma_device_type": 1 00:14:38.607 }, 00:14:38.607 { 00:14:38.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.607 "dma_device_type": 2 00:14:38.607 } 00:14:38.607 ], 00:14:38.607 "driver_specific": { 00:14:38.607 "passthru": { 00:14:38.607 "name": "pt3", 00:14:38.607 "base_bdev_name": "malloc3" 00:14:38.607 } 00:14:38.607 } 00:14:38.607 }' 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.607 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:38.866 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.125 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.125 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:39.125 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:39.125 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:39.125 [2024-07-15 07:50:23.868225] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:39.385 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ff995da8-4cf3-4d31-b09a-a7d04163c53c 00:14:39.385 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ff995da8-4cf3-4d31-b09a-a7d04163c53c ']' 00:14:39.385 07:50:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:39.385 [2024-07-15 07:50:24.060494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:39.385 [2024-07-15 07:50:24.060507] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:39.385 [2024-07-15 07:50:24.060540] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:39.385 [2024-07-15 07:50:24.060577] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:39.385 [2024-07-15 07:50:24.060583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x136da90 name raid_bdev1, state offline 00:14:39.385 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.385 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:39.644 07:50:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:39.644 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:39.644 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:39.645 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:39.904 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:39.904 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:39.904 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:39.904 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:40.164 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:40.164 07:50:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:40.424 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:40.684 [2024-07-15 07:50:25.199343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:40.684 [2024-07-15 07:50:25.200400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:40.684 [2024-07-15 07:50:25.200433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:40.684 [2024-07-15 07:50:25.200465] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:40.684 [2024-07-15 07:50:25.200491] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:40.684 [2024-07-15 07:50:25.200504] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:40.684 [2024-07-15 07:50:25.200514] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:40.684 [2024-07-15 07:50:25.200520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1369bf0 name raid_bdev1, state configuring 00:14:40.684 request: 00:14:40.684 { 00:14:40.684 "name": "raid_bdev1", 00:14:40.684 "raid_level": "concat", 00:14:40.684 "base_bdevs": [ 00:14:40.684 "malloc1", 00:14:40.684 "malloc2", 00:14:40.684 "malloc3" 00:14:40.684 ], 00:14:40.684 "strip_size_kb": 64, 00:14:40.684 "superblock": false, 00:14:40.684 "method": "bdev_raid_create", 00:14:40.684 "req_id": 1 00:14:40.684 } 00:14:40.684 Got JSON-RPC error response 00:14:40.684 response: 00:14:40.684 { 00:14:40.684 "code": -17, 00:14:40.684 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:40.684 } 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:40.684 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:40.944 [2024-07-15 07:50:25.584265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:40.944 [2024-07-15 07:50:25.584284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:40.944 [2024-07-15 07:50:25.584295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c1e00 00:14:40.944 [2024-07-15 07:50:25.584301] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:40.944 [2024-07-15 07:50:25.585545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:40.944 [2024-07-15 07:50:25.585565] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:40.944 [2024-07-15 07:50:25.585607] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:40.944 [2024-07-15 07:50:25.585625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:40.944 pt1 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.944 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:41.203 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.203 "name": "raid_bdev1", 00:14:41.203 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:41.203 "strip_size_kb": 64, 00:14:41.203 "state": "configuring", 00:14:41.203 "raid_level": "concat", 00:14:41.203 "superblock": true, 00:14:41.203 "num_base_bdevs": 3, 00:14:41.203 "num_base_bdevs_discovered": 1, 00:14:41.203 "num_base_bdevs_operational": 3, 00:14:41.203 "base_bdevs_list": [ 00:14:41.203 { 00:14:41.203 "name": "pt1", 00:14:41.203 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:41.203 "is_configured": true, 00:14:41.203 "data_offset": 2048, 00:14:41.203 "data_size": 63488 00:14:41.203 }, 00:14:41.203 { 00:14:41.203 "name": null, 00:14:41.203 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:41.203 "is_configured": false, 00:14:41.203 "data_offset": 2048, 00:14:41.203 "data_size": 63488 00:14:41.203 }, 00:14:41.203 { 00:14:41.203 "name": null, 00:14:41.203 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:41.203 "is_configured": false, 00:14:41.203 "data_offset": 2048, 00:14:41.203 "data_size": 63488 00:14:41.203 } 00:14:41.203 ] 00:14:41.203 }' 00:14:41.203 07:50:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.203 07:50:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.773 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:41.773 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:42.069 [2024-07-15 
07:50:26.546702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:42.069 [2024-07-15 07:50:26.546741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:42.069 [2024-07-15 07:50:26.546751] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x136ff60 00:14:42.069 [2024-07-15 07:50:26.546757] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:42.069 [2024-07-15 07:50:26.547014] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:42.069 [2024-07-15 07:50:26.547025] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:42.069 [2024-07-15 07:50:26.547066] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:42.069 [2024-07-15 07:50:26.547079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:42.069 pt2 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:42.069 [2024-07-15 07:50:26.723150] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.069 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:42.347 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.347 "name": "raid_bdev1", 00:14:42.347 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:42.347 "strip_size_kb": 64, 00:14:42.347 "state": "configuring", 00:14:42.347 "raid_level": "concat", 00:14:42.347 "superblock": true, 00:14:42.347 "num_base_bdevs": 3, 00:14:42.347 "num_base_bdevs_discovered": 1, 00:14:42.347 "num_base_bdevs_operational": 3, 00:14:42.347 "base_bdevs_list": [ 00:14:42.347 { 00:14:42.347 "name": "pt1", 00:14:42.347 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:42.347 "is_configured": true, 00:14:42.347 "data_offset": 2048, 00:14:42.347 "data_size": 63488 00:14:42.347 }, 00:14:42.347 { 00:14:42.347 "name": null, 00:14:42.347 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:42.347 "is_configured": false, 
00:14:42.347 "data_offset": 2048, 00:14:42.347 "data_size": 63488 00:14:42.347 }, 00:14:42.347 { 00:14:42.347 "name": null, 00:14:42.347 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:42.347 "is_configured": false, 00:14:42.347 "data_offset": 2048, 00:14:42.347 "data_size": 63488 00:14:42.347 } 00:14:42.347 ] 00:14:42.347 }' 00:14:42.347 07:50:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.347 07:50:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.916 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:42.916 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:42.916 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:42.916 [2024-07-15 07:50:27.657509] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:42.916 [2024-07-15 07:50:27.657536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:42.916 [2024-07-15 07:50:27.657545] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x136aaa0 00:14:42.916 [2024-07-15 07:50:27.657552] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:42.916 [2024-07-15 07:50:27.657812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:42.916 [2024-07-15 07:50:27.657824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:42.916 [2024-07-15 07:50:27.657862] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:42.916 [2024-07-15 07:50:27.657874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:42.916 pt2 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:43.177 [2024-07-15 07:50:27.841974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:43.177 [2024-07-15 07:50:27.841991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:43.177 [2024-07-15 07:50:27.842000] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1372410 00:14:43.177 [2024-07-15 07:50:27.842006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:43.177 [2024-07-15 07:50:27.842220] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:43.177 [2024-07-15 07:50:27.842230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:43.177 [2024-07-15 07:50:27.842260] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:43.177 [2024-07-15 07:50:27.842270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:43.177 [2024-07-15 07:50:27.842347] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x136adc0 00:14:43.177 [2024-07-15 07:50:27.842353] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:43.177 [2024-07-15 07:50:27.842493] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1372230 00:14:43.177 [2024-07-15 07:50:27.842588] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x136adc0 00:14:43.177 [2024-07-15 07:50:27.842593] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x136adc0 00:14:43.177 [2024-07-15 07:50:27.842664] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:43.177 pt3 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.177 07:50:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:43.437 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.437 "name": "raid_bdev1", 00:14:43.437 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:43.437 "strip_size_kb": 64, 00:14:43.437 "state": "online", 00:14:43.437 "raid_level": "concat", 00:14:43.437 "superblock": true, 00:14:43.437 "num_base_bdevs": 3, 00:14:43.437 "num_base_bdevs_discovered": 3, 00:14:43.437 "num_base_bdevs_operational": 3, 00:14:43.437 "base_bdevs_list": [ 00:14:43.437 { 00:14:43.437 "name": "pt1", 00:14:43.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:43.437 "is_configured": true, 00:14:43.437 "data_offset": 2048, 00:14:43.437 "data_size": 63488 00:14:43.437 }, 00:14:43.437 { 00:14:43.437 "name": "pt2", 00:14:43.437 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:43.437 "is_configured": true, 00:14:43.437 "data_offset": 2048, 00:14:43.437 "data_size": 63488 00:14:43.437 }, 00:14:43.437 { 00:14:43.437 "name": "pt3", 00:14:43.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:43.437 "is_configured": true, 00:14:43.437 "data_offset": 2048, 00:14:43.437 "data_size": 63488 00:14:43.437 } 00:14:43.437 ] 00:14:43.437 }' 00:14:43.437 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.437 07:50:28 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:44.006 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:44.266 [2024-07-15 07:50:28.828693] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:44.266 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:44.266 "name": "raid_bdev1", 00:14:44.266 "aliases": [ 00:14:44.266 "ff995da8-4cf3-4d31-b09a-a7d04163c53c" 00:14:44.266 ], 00:14:44.266 "product_name": "Raid Volume", 00:14:44.266 "block_size": 512, 00:14:44.266 "num_blocks": 190464, 00:14:44.266 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:44.266 "assigned_rate_limits": { 00:14:44.266 "rw_ios_per_sec": 0, 00:14:44.266 "rw_mbytes_per_sec": 0, 00:14:44.266 "r_mbytes_per_sec": 0, 00:14:44.266 "w_mbytes_per_sec": 0 00:14:44.266 }, 00:14:44.266 "claimed": false, 00:14:44.266 "zoned": false, 00:14:44.266 "supported_io_types": { 00:14:44.266 "read": true, 00:14:44.266 "write": true, 00:14:44.266 "unmap": true, 00:14:44.266 "flush": true, 00:14:44.266 "reset": true, 00:14:44.266 "nvme_admin": false, 00:14:44.266 "nvme_io": false, 00:14:44.266 "nvme_io_md": false, 00:14:44.266 "write_zeroes": true, 00:14:44.266 "zcopy": false, 00:14:44.266 "get_zone_info": false, 00:14:44.266 "zone_management": false, 00:14:44.266 "zone_append": false, 00:14:44.266 "compare": false, 00:14:44.266 "compare_and_write": false, 00:14:44.266 "abort": false, 00:14:44.266 "seek_hole": false, 00:14:44.266 "seek_data": false, 00:14:44.266 "copy": false, 00:14:44.266 "nvme_iov_md": false 00:14:44.266 }, 00:14:44.266 "memory_domains": [ 00:14:44.266 { 00:14:44.266 "dma_device_id": "system", 00:14:44.266 "dma_device_type": 1 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.266 "dma_device_type": 2 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "dma_device_id": "system", 00:14:44.266 "dma_device_type": 1 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.266 "dma_device_type": 2 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "dma_device_id": "system", 00:14:44.266 "dma_device_type": 1 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.266 "dma_device_type": 2 00:14:44.266 } 00:14:44.266 ], 00:14:44.266 "driver_specific": { 00:14:44.266 "raid": { 00:14:44.266 "uuid": "ff995da8-4cf3-4d31-b09a-a7d04163c53c", 00:14:44.266 "strip_size_kb": 64, 00:14:44.266 "state": "online", 00:14:44.266 "raid_level": "concat", 00:14:44.266 "superblock": true, 00:14:44.266 "num_base_bdevs": 3, 00:14:44.266 "num_base_bdevs_discovered": 3, 
00:14:44.266 "num_base_bdevs_operational": 3, 00:14:44.266 "base_bdevs_list": [ 00:14:44.266 { 00:14:44.266 "name": "pt1", 00:14:44.266 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:44.266 "is_configured": true, 00:14:44.266 "data_offset": 2048, 00:14:44.266 "data_size": 63488 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "name": "pt2", 00:14:44.266 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:44.266 "is_configured": true, 00:14:44.266 "data_offset": 2048, 00:14:44.266 "data_size": 63488 00:14:44.266 }, 00:14:44.266 { 00:14:44.266 "name": "pt3", 00:14:44.266 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:44.266 "is_configured": true, 00:14:44.266 "data_offset": 2048, 00:14:44.266 "data_size": 63488 00:14:44.266 } 00:14:44.266 ] 00:14:44.266 } 00:14:44.266 } 00:14:44.266 }' 00:14:44.266 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:44.266 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:44.266 pt2 00:14:44.266 pt3' 00:14:44.266 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.266 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:44.266 07:50:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.525 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.525 "name": "pt1", 00:14:44.525 "aliases": [ 00:14:44.525 "00000000-0000-0000-0000-000000000001" 00:14:44.525 ], 00:14:44.525 "product_name": "passthru", 00:14:44.525 "block_size": 512, 00:14:44.525 "num_blocks": 65536, 00:14:44.525 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:44.525 "assigned_rate_limits": { 00:14:44.525 "rw_ios_per_sec": 0, 00:14:44.525 "rw_mbytes_per_sec": 0, 00:14:44.525 "r_mbytes_per_sec": 0, 00:14:44.525 "w_mbytes_per_sec": 0 00:14:44.525 }, 00:14:44.525 "claimed": true, 00:14:44.525 "claim_type": "exclusive_write", 00:14:44.525 "zoned": false, 00:14:44.525 "supported_io_types": { 00:14:44.525 "read": true, 00:14:44.525 "write": true, 00:14:44.525 "unmap": true, 00:14:44.525 "flush": true, 00:14:44.525 "reset": true, 00:14:44.525 "nvme_admin": false, 00:14:44.525 "nvme_io": false, 00:14:44.525 "nvme_io_md": false, 00:14:44.525 "write_zeroes": true, 00:14:44.525 "zcopy": true, 00:14:44.525 "get_zone_info": false, 00:14:44.525 "zone_management": false, 00:14:44.525 "zone_append": false, 00:14:44.525 "compare": false, 00:14:44.525 "compare_and_write": false, 00:14:44.525 "abort": true, 00:14:44.525 "seek_hole": false, 00:14:44.525 "seek_data": false, 00:14:44.525 "copy": true, 00:14:44.525 "nvme_iov_md": false 00:14:44.525 }, 00:14:44.525 "memory_domains": [ 00:14:44.525 { 00:14:44.525 "dma_device_id": "system", 00:14:44.525 "dma_device_type": 1 00:14:44.525 }, 00:14:44.525 { 00:14:44.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.525 "dma_device_type": 2 00:14:44.525 } 00:14:44.525 ], 00:14:44.525 "driver_specific": { 00:14:44.525 "passthru": { 00:14:44.525 "name": "pt1", 00:14:44.525 "base_bdev_name": "malloc1" 00:14:44.525 } 00:14:44.525 } 00:14:44.525 }' 00:14:44.525 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.525 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:14:44.525 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.525 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.783 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.783 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:44.783 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.783 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.783 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.783 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.042 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.042 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.042 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.042 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:45.042 07:50:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.610 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.610 "name": "pt2", 00:14:45.610 "aliases": [ 00:14:45.610 "00000000-0000-0000-0000-000000000002" 00:14:45.610 ], 00:14:45.610 "product_name": "passthru", 00:14:45.610 "block_size": 512, 00:14:45.610 "num_blocks": 65536, 00:14:45.610 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:45.610 "assigned_rate_limits": { 00:14:45.610 "rw_ios_per_sec": 0, 00:14:45.610 "rw_mbytes_per_sec": 0, 00:14:45.610 "r_mbytes_per_sec": 0, 00:14:45.610 "w_mbytes_per_sec": 0 00:14:45.610 }, 00:14:45.610 "claimed": true, 00:14:45.610 "claim_type": "exclusive_write", 00:14:45.610 "zoned": false, 00:14:45.610 "supported_io_types": { 00:14:45.610 "read": true, 00:14:45.610 "write": true, 00:14:45.610 "unmap": true, 00:14:45.610 "flush": true, 00:14:45.610 "reset": true, 00:14:45.610 "nvme_admin": false, 00:14:45.610 "nvme_io": false, 00:14:45.610 "nvme_io_md": false, 00:14:45.610 "write_zeroes": true, 00:14:45.610 "zcopy": true, 00:14:45.610 "get_zone_info": false, 00:14:45.610 "zone_management": false, 00:14:45.610 "zone_append": false, 00:14:45.610 "compare": false, 00:14:45.610 "compare_and_write": false, 00:14:45.610 "abort": true, 00:14:45.610 "seek_hole": false, 00:14:45.610 "seek_data": false, 00:14:45.610 "copy": true, 00:14:45.610 "nvme_iov_md": false 00:14:45.610 }, 00:14:45.610 "memory_domains": [ 00:14:45.610 { 00:14:45.610 "dma_device_id": "system", 00:14:45.610 "dma_device_type": 1 00:14:45.610 }, 00:14:45.610 { 00:14:45.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.610 "dma_device_type": 2 00:14:45.610 } 00:14:45.610 ], 00:14:45.610 "driver_specific": { 00:14:45.610 "passthru": { 00:14:45.610 "name": "pt2", 00:14:45.610 "base_bdev_name": "malloc2" 00:14:45.610 } 00:14:45.610 } 00:14:45.610 }' 00:14:45.610 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.610 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.610 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.610 07:50:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.610 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:45.869 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:46.175 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:46.175 "name": "pt3", 00:14:46.175 "aliases": [ 00:14:46.175 "00000000-0000-0000-0000-000000000003" 00:14:46.175 ], 00:14:46.175 "product_name": "passthru", 00:14:46.175 "block_size": 512, 00:14:46.175 "num_blocks": 65536, 00:14:46.175 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:46.175 "assigned_rate_limits": { 00:14:46.175 "rw_ios_per_sec": 0, 00:14:46.175 "rw_mbytes_per_sec": 0, 00:14:46.175 "r_mbytes_per_sec": 0, 00:14:46.175 "w_mbytes_per_sec": 0 00:14:46.175 }, 00:14:46.175 "claimed": true, 00:14:46.175 "claim_type": "exclusive_write", 00:14:46.175 "zoned": false, 00:14:46.175 "supported_io_types": { 00:14:46.175 "read": true, 00:14:46.175 "write": true, 00:14:46.175 "unmap": true, 00:14:46.175 "flush": true, 00:14:46.175 "reset": true, 00:14:46.175 "nvme_admin": false, 00:14:46.175 "nvme_io": false, 00:14:46.175 "nvme_io_md": false, 00:14:46.175 "write_zeroes": true, 00:14:46.175 "zcopy": true, 00:14:46.175 "get_zone_info": false, 00:14:46.175 "zone_management": false, 00:14:46.175 "zone_append": false, 00:14:46.175 "compare": false, 00:14:46.175 "compare_and_write": false, 00:14:46.175 "abort": true, 00:14:46.175 "seek_hole": false, 00:14:46.175 "seek_data": false, 00:14:46.175 "copy": true, 00:14:46.175 "nvme_iov_md": false 00:14:46.176 }, 00:14:46.176 "memory_domains": [ 00:14:46.176 { 00:14:46.176 "dma_device_id": "system", 00:14:46.176 "dma_device_type": 1 00:14:46.176 }, 00:14:46.176 { 00:14:46.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.176 "dma_device_type": 2 00:14:46.176 } 00:14:46.176 ], 00:14:46.176 "driver_specific": { 00:14:46.176 "passthru": { 00:14:46.176 "name": "pt3", 00:14:46.176 "base_bdev_name": "malloc3" 00:14:46.176 } 00:14:46.176 } 00:14:46.176 }' 00:14:46.176 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.176 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.176 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:46.176 07:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.434 07:50:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.434 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:46.434 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.434 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.692 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:46.692 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.692 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.692 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:46.692 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:46.692 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:47.261 [2024-07-15 07:50:31.820238] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ff995da8-4cf3-4d31-b09a-a7d04163c53c '!=' ff995da8-4cf3-4d31-b09a-a7d04163c53c ']' 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1633098 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1633098 ']' 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1633098 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1633098 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1633098' 00:14:47.261 killing process with pid 1633098 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1633098 00:14:47.261 [2024-07-15 07:50:31.907494] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:47.261 [2024-07-15 07:50:31.907531] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:47.261 [2024-07-15 07:50:31.907568] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:47.261 [2024-07-15 07:50:31.907574] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x136adc0 name raid_bdev1, state offline 00:14:47.261 07:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1633098 00:14:47.261 [2024-07-15 07:50:31.922415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:47.521 07:50:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:47.521 00:14:47.521 real 0m13.075s 00:14:47.521 user 0m24.159s 00:14:47.521 sys 0m1.845s 00:14:47.521 07:50:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:47.521 07:50:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.521 ************************************ 00:14:47.521 END TEST raid_superblock_test 00:14:47.521 ************************************ 00:14:47.521 07:50:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:47.521 07:50:32 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:14:47.521 07:50:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:47.521 07:50:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:47.521 07:50:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:47.521 ************************************ 00:14:47.521 START TEST raid_read_error_test 00:14:47.521 ************************************ 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:47.521 07:50:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3n1NaDIf1K 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1635676 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1635676 /var/tmp/spdk-raid.sock 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1635676 ']' 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:47.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:47.521 07:50:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.521 [2024-07-15 07:50:32.186924] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
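[editor's note] raid_io_error_test drives I/O through bdevperf rather than the unit-test app: the "Starting SPDK" banner above comes from the bdevperf process launched with the flags shown (60 s of randrw at 50 % reads, 128k I/O size, queue depth 1, -z so it waits for RPC configuration, with its output kept under /raidtest). A sketch of the equivalent manual launch, assuming the workspace paths of this run; capturing the output with a plain redirection is an assumption, the trace only shows the flags and the log file name:

# Hedged sketch of the bdevperf launch used by raid_io_error_test (paths as in this run).
spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
sock=/var/tmp/spdk-raid.sock
log=$(mktemp -p /raidtest)
"$spdk/build/examples/bdevperf" -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
    -o 128k -q 1 -z -f -L bdev_raid > "$log" &
# The test then waits for the RPC socket (in-tree waitforlisten helper), builds each
# BaseBdevN as malloc -> error bdev -> passthru over the same socket, assembles the
# concat raid, and finally starts the workload with:
#   "$spdk/examples/bdev/bdevperf/bdevperf.py" -s "$sock" perform_tests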
00:14:47.521 [2024-07-15 07:50:32.186978] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1635676 ] 00:14:47.521 [2024-07-15 07:50:32.276191] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:47.782 [2024-07-15 07:50:32.342382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:47.782 [2024-07-15 07:50:32.381417] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:47.782 [2024-07-15 07:50:32.381442] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:48.351 07:50:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:48.351 07:50:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:48.351 07:50:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:48.351 07:50:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:48.920 BaseBdev1_malloc 00:14:48.920 07:50:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:49.179 true 00:14:49.179 07:50:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:49.747 [2024-07-15 07:50:34.309698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:49.747 [2024-07-15 07:50:34.309735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:49.747 [2024-07-15 07:50:34.309748] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x106eb50 00:14:49.747 [2024-07-15 07:50:34.309759] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:49.747 [2024-07-15 07:50:34.311082] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:49.747 [2024-07-15 07:50:34.311103] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:49.747 BaseBdev1 00:14:49.747 07:50:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:49.747 07:50:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:50.007 BaseBdev2_malloc 00:14:50.007 07:50:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:50.267 true 00:14:50.267 07:50:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:50.267 [2024-07-15 07:50:34.941303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:50.267 [2024-07-15 07:50:34.941330] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.267 [2024-07-15 07:50:34.941341] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1052ea0 00:14:50.267 [2024-07-15 07:50:34.941346] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.267 [2024-07-15 07:50:34.942517] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.267 [2024-07-15 07:50:34.942536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:50.267 BaseBdev2 00:14:50.267 07:50:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:50.267 07:50:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:50.836 BaseBdev3_malloc 00:14:50.836 07:50:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:51.404 true 00:14:51.405 07:50:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:51.974 [2024-07-15 07:50:36.547269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:51.974 [2024-07-15 07:50:36.547299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:51.974 [2024-07-15 07:50:36.547312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1056fb0 00:14:51.974 [2024-07-15 07:50:36.547318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:51.974 [2024-07-15 07:50:36.548514] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:51.974 [2024-07-15 07:50:36.548535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:51.974 BaseBdev3 00:14:51.974 07:50:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:52.543 [2024-07-15 07:50:37.088641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:52.543 [2024-07-15 07:50:37.089660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:52.543 [2024-07-15 07:50:37.089719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:52.543 [2024-07-15 07:50:37.089872] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10580e0 00:14:52.543 [2024-07-15 07:50:37.089879] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:52.543 [2024-07-15 07:50:37.090026] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeba250 00:14:52.543 [2024-07-15 07:50:37.090138] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10580e0 00:14:52.543 [2024-07-15 07:50:37.090147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10580e0 00:14:52.543 [2024-07-15 07:50:37.090221] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:52.543 
07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:52.543 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:52.543 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:52.543 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:52.543 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.543 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.543 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.544 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.544 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.544 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.544 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.544 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:53.113 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.113 "name": "raid_bdev1", 00:14:53.113 "uuid": "ce5159c7-2516-44e8-ae69-dcf2893be83e", 00:14:53.113 "strip_size_kb": 64, 00:14:53.113 "state": "online", 00:14:53.113 "raid_level": "concat", 00:14:53.113 "superblock": true, 00:14:53.113 "num_base_bdevs": 3, 00:14:53.113 "num_base_bdevs_discovered": 3, 00:14:53.113 "num_base_bdevs_operational": 3, 00:14:53.113 "base_bdevs_list": [ 00:14:53.113 { 00:14:53.113 "name": "BaseBdev1", 00:14:53.113 "uuid": "15638666-f8d9-5ad6-83a5-e44ce9311f46", 00:14:53.113 "is_configured": true, 00:14:53.113 "data_offset": 2048, 00:14:53.113 "data_size": 63488 00:14:53.113 }, 00:14:53.113 { 00:14:53.113 "name": "BaseBdev2", 00:14:53.113 "uuid": "6a0168fb-63a2-57cf-8779-6ea12239ff42", 00:14:53.113 "is_configured": true, 00:14:53.113 "data_offset": 2048, 00:14:53.113 "data_size": 63488 00:14:53.113 }, 00:14:53.113 { 00:14:53.113 "name": "BaseBdev3", 00:14:53.113 "uuid": "9935fd1a-b9ec-56ba-a388-589eea433d11", 00:14:53.113 "is_configured": true, 00:14:53.113 "data_offset": 2048, 00:14:53.113 "data_size": 63488 00:14:53.113 } 00:14:53.113 ] 00:14:53.113 }' 00:14:53.113 07:50:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.113 07:50:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.051 07:50:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:54.051 07:50:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:54.051 [2024-07-15 07:50:38.696984] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1057da0 00:14:54.990 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.250 07:50:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:55.818 07:50:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.818 "name": "raid_bdev1", 00:14:55.818 "uuid": "ce5159c7-2516-44e8-ae69-dcf2893be83e", 00:14:55.818 "strip_size_kb": 64, 00:14:55.818 "state": "online", 00:14:55.818 "raid_level": "concat", 00:14:55.818 "superblock": true, 00:14:55.818 "num_base_bdevs": 3, 00:14:55.818 "num_base_bdevs_discovered": 3, 00:14:55.818 "num_base_bdevs_operational": 3, 00:14:55.818 "base_bdevs_list": [ 00:14:55.818 { 00:14:55.818 "name": "BaseBdev1", 00:14:55.818 "uuid": "15638666-f8d9-5ad6-83a5-e44ce9311f46", 00:14:55.818 "is_configured": true, 00:14:55.818 "data_offset": 2048, 00:14:55.818 "data_size": 63488 00:14:55.818 }, 00:14:55.818 { 00:14:55.818 "name": "BaseBdev2", 00:14:55.818 "uuid": "6a0168fb-63a2-57cf-8779-6ea12239ff42", 00:14:55.818 "is_configured": true, 00:14:55.818 "data_offset": 2048, 00:14:55.818 "data_size": 63488 00:14:55.818 }, 00:14:55.818 { 00:14:55.818 "name": "BaseBdev3", 00:14:55.818 "uuid": "9935fd1a-b9ec-56ba-a388-589eea433d11", 00:14:55.818 "is_configured": true, 00:14:55.818 "data_offset": 2048, 00:14:55.818 "data_size": 63488 00:14:55.818 } 00:14:55.818 ] 00:14:55.818 }' 00:14:55.818 07:50:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.818 07:50:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.388 07:50:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:56.388 [2024-07-15 07:50:41.098760] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:56.388 [2024-07-15 07:50:41.098789] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:56.388 [2024-07-15 
07:50:41.101429] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:56.388 [2024-07-15 07:50:41.101454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:56.388 [2024-07-15 07:50:41.101477] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:56.388 [2024-07-15 07:50:41.101483] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10580e0 name raid_bdev1, state offline 00:14:56.388 0 00:14:56.388 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1635676 00:14:56.388 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1635676 ']' 00:14:56.388 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1635676 00:14:56.388 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:56.388 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:56.388 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1635676 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1635676' 00:14:56.649 killing process with pid 1635676 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1635676 00:14:56.649 [2024-07-15 07:50:41.170737] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1635676 00:14:56.649 [2024-07-15 07:50:41.181984] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3n1NaDIf1K 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:14:56.649 00:14:56.649 real 0m9.199s 00:14:56.649 user 0m15.759s 00:14:56.649 sys 0m1.077s 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:56.649 07:50:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.649 ************************************ 00:14:56.649 END TEST raid_read_error_test 00:14:56.649 ************************************ 00:14:56.649 07:50:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:56.649 07:50:41 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:14:56.649 07:50:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
00:14:56.649 07:50:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:56.649 07:50:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:56.649 ************************************ 00:14:56.649 START TEST raid_write_error_test 00:14:56.649 ************************************ 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.J2mHHTo6Nz 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1637346 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1637346 /var/tmp/spdk-raid.sock 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:56.649 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1637346 ']' 00:14:56.910 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:56.910 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:56.910 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:56.910 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:56.910 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:56.910 07:50:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.910 [2024-07-15 07:50:41.457063] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:14:56.910 [2024-07-15 07:50:41.457106] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1637346 ] 00:14:56.910 [2024-07-15 07:50:41.544887] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:56.910 [2024-07-15 07:50:41.609655] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:56.910 [2024-07-15 07:50:41.651982] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:56.910 [2024-07-15 07:50:41.652008] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:57.871 07:50:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:57.871 07:50:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:57.871 07:50:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:57.871 07:50:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:57.871 BaseBdev1_malloc 00:14:57.871 07:50:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:58.131 true 00:14:58.132 07:50:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:58.700 [2024-07-15 07:50:43.191528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:58.700 [2024-07-15 07:50:43.191560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:58.700 [2024-07-15 07:50:43.191572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb72b50 00:14:58.700 [2024-07-15 07:50:43.191579] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:58.700 [2024-07-15 07:50:43.192907] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:58.700 [2024-07-15 
07:50:43.192928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:58.700 BaseBdev1 00:14:58.700 07:50:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:58.700 07:50:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:58.700 BaseBdev2_malloc 00:14:58.700 07:50:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:59.269 true 00:14:59.269 07:50:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:59.865 [2024-07-15 07:50:44.464732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:59.865 [2024-07-15 07:50:44.464768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:59.865 [2024-07-15 07:50:44.464780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb56ea0 00:14:59.865 [2024-07-15 07:50:44.464786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:59.865 [2024-07-15 07:50:44.465985] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:59.865 [2024-07-15 07:50:44.466006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:59.865 BaseBdev2 00:14:59.865 07:50:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:59.865 07:50:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:00.441 BaseBdev3_malloc 00:15:00.441 07:50:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:01.010 true 00:15:01.010 07:50:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:01.581 [2024-07-15 07:50:46.094772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:01.581 [2024-07-15 07:50:46.094805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.581 [2024-07-15 07:50:46.094817] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb5afb0 00:15:01.581 [2024-07-15 07:50:46.094824] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.581 [2024-07-15 07:50:46.096029] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.581 [2024-07-15 07:50:46.096049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:01.581 BaseBdev3 00:15:01.581 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 
00:15:02.151 [2024-07-15 07:50:46.636148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:02.151 [2024-07-15 07:50:46.637169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:02.151 [2024-07-15 07:50:46.637222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:02.151 [2024-07-15 07:50:46.637381] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb5c0e0 00:15:02.152 [2024-07-15 07:50:46.637390] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:02.152 [2024-07-15 07:50:46.637543] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9be250 00:15:02.152 [2024-07-15 07:50:46.637659] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb5c0e0 00:15:02.152 [2024-07-15 07:50:46.637664] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb5c0e0 00:15:02.152 [2024-07-15 07:50:46.637747] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.152 "name": "raid_bdev1", 00:15:02.152 "uuid": "d9026e07-e138-481e-b597-7ac9a370f8fb", 00:15:02.152 "strip_size_kb": 64, 00:15:02.152 "state": "online", 00:15:02.152 "raid_level": "concat", 00:15:02.152 "superblock": true, 00:15:02.152 "num_base_bdevs": 3, 00:15:02.152 "num_base_bdevs_discovered": 3, 00:15:02.152 "num_base_bdevs_operational": 3, 00:15:02.152 "base_bdevs_list": [ 00:15:02.152 { 00:15:02.152 "name": "BaseBdev1", 00:15:02.152 "uuid": "4f03aaff-394b-5358-ab04-1eb431f63c06", 00:15:02.152 "is_configured": true, 00:15:02.152 "data_offset": 2048, 00:15:02.152 "data_size": 63488 00:15:02.152 }, 00:15:02.152 { 00:15:02.152 "name": "BaseBdev2", 00:15:02.152 "uuid": "484dc53f-ee52-57f3-b25f-d5c074a90856", 00:15:02.152 "is_configured": true, 00:15:02.152 "data_offset": 2048, 00:15:02.152 "data_size": 63488 00:15:02.152 }, 00:15:02.152 { 00:15:02.152 "name": 
"BaseBdev3", 00:15:02.152 "uuid": "471571e4-9a5b-555a-9351-bdbe4bdc1849", 00:15:02.152 "is_configured": true, 00:15:02.152 "data_offset": 2048, 00:15:02.152 "data_size": 63488 00:15:02.152 } 00:15:02.152 ] 00:15:02.152 }' 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.152 07:50:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.722 07:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:02.722 07:50:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:02.981 [2024-07-15 07:50:47.502544] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb5bda0 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.921 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:04.181 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.181 "name": "raid_bdev1", 00:15:04.181 "uuid": "d9026e07-e138-481e-b597-7ac9a370f8fb", 00:15:04.181 "strip_size_kb": 64, 00:15:04.181 "state": "online", 00:15:04.181 "raid_level": "concat", 00:15:04.181 "superblock": true, 00:15:04.181 "num_base_bdevs": 3, 00:15:04.181 "num_base_bdevs_discovered": 3, 00:15:04.181 "num_base_bdevs_operational": 3, 00:15:04.181 "base_bdevs_list": [ 00:15:04.181 { 00:15:04.181 "name": "BaseBdev1", 00:15:04.181 "uuid": "4f03aaff-394b-5358-ab04-1eb431f63c06", 00:15:04.181 "is_configured": true, 00:15:04.181 "data_offset": 2048, 00:15:04.181 "data_size": 63488 
00:15:04.181 }, 00:15:04.181 { 00:15:04.181 "name": "BaseBdev2", 00:15:04.181 "uuid": "484dc53f-ee52-57f3-b25f-d5c074a90856", 00:15:04.181 "is_configured": true, 00:15:04.181 "data_offset": 2048, 00:15:04.181 "data_size": 63488 00:15:04.181 }, 00:15:04.181 { 00:15:04.181 "name": "BaseBdev3", 00:15:04.181 "uuid": "471571e4-9a5b-555a-9351-bdbe4bdc1849", 00:15:04.181 "is_configured": true, 00:15:04.181 "data_offset": 2048, 00:15:04.181 "data_size": 63488 00:15:04.181 } 00:15:04.181 ] 00:15:04.181 }' 00:15:04.181 07:50:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.181 07:50:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.751 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:04.751 [2024-07-15 07:50:49.502109] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:04.751 [2024-07-15 07:50:49.502139] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:04.751 [2024-07-15 07:50:49.504750] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:04.751 [2024-07-15 07:50:49.504777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.751 [2024-07-15 07:50:49.504801] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:04.751 [2024-07-15 07:50:49.504807] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb5c0e0 name raid_bdev1, state offline 00:15:04.751 0 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1637346 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1637346 ']' 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1637346 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1637346 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1637346' 00:15:05.011 killing process with pid 1637346 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1637346 00:15:05.011 [2024-07-15 07:50:49.603934] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1637346 00:15:05.011 [2024-07-15 07:50:49.615155] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.J2mHHTo6Nz 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:05.011 07:50:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:15:05.011 00:15:05.011 real 0m8.359s 00:15:05.011 user 0m14.086s 00:15:05.011 sys 0m1.027s 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:05.011 07:50:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.011 ************************************ 00:15:05.011 END TEST raid_write_error_test 00:15:05.011 ************************************ 00:15:05.271 07:50:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:05.271 07:50:49 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:05.271 07:50:49 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:15:05.271 07:50:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:05.271 07:50:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:05.271 07:50:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:05.271 ************************************ 00:15:05.271 START TEST raid_state_function_test 00:15:05.271 ************************************ 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1638697 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1638697' 00:15:05.271 Process raid pid: 1638697 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1638697 /var/tmp/spdk-raid.sock 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1638697 ']' 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:05.271 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.271 07:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:05.271 [2024-07-15 07:50:49.925152] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:15:05.271 [2024-07-15 07:50:49.925285] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:05.530 [2024-07-15 07:50:50.073531] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:05.530 [2024-07-15 07:50:50.153739] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:05.530 [2024-07-15 07:50:50.197762] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:05.531 [2024-07-15 07:50:50.197785] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:06.469 07:50:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:06.469 07:50:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:06.469 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:06.728 [2024-07-15 07:50:51.245824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:06.728 [2024-07-15 07:50:51.245851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:06.728 [2024-07-15 07:50:51.245857] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:06.728 [2024-07-15 07:50:51.245863] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:06.728 [2024-07-15 07:50:51.245868] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:06.728 [2024-07-15 07:50:51.245873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.728 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.295 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:15:07.295 "name": "Existed_Raid", 00:15:07.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.296 "strip_size_kb": 0, 00:15:07.296 "state": "configuring", 00:15:07.296 "raid_level": "raid1", 00:15:07.296 "superblock": false, 00:15:07.296 "num_base_bdevs": 3, 00:15:07.296 "num_base_bdevs_discovered": 0, 00:15:07.296 "num_base_bdevs_operational": 3, 00:15:07.296 "base_bdevs_list": [ 00:15:07.296 { 00:15:07.296 "name": "BaseBdev1", 00:15:07.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.296 "is_configured": false, 00:15:07.296 "data_offset": 0, 00:15:07.296 "data_size": 0 00:15:07.296 }, 00:15:07.296 { 00:15:07.296 "name": "BaseBdev2", 00:15:07.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.296 "is_configured": false, 00:15:07.296 "data_offset": 0, 00:15:07.296 "data_size": 0 00:15:07.296 }, 00:15:07.296 { 00:15:07.296 "name": "BaseBdev3", 00:15:07.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.296 "is_configured": false, 00:15:07.296 "data_offset": 0, 00:15:07.296 "data_size": 0 00:15:07.296 } 00:15:07.296 ] 00:15:07.296 }' 00:15:07.296 07:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.296 07:50:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.863 07:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:07.863 [2024-07-15 07:50:52.508906] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:07.863 [2024-07-15 07:50:52.508925] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b386d0 name Existed_Raid, state configuring 00:15:07.863 07:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:08.122 [2024-07-15 07:50:52.697392] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:08.122 [2024-07-15 07:50:52.697410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:08.122 [2024-07-15 07:50:52.697415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:08.122 [2024-07-15 07:50:52.697421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:08.122 [2024-07-15 07:50:52.697426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:08.122 [2024-07-15 07:50:52.697431] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:08.122 07:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:08.382 [2024-07-15 07:50:52.892597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:08.382 BaseBdev1 00:15:08.382 07:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:08.382 07:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:08.382 07:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:08.382 
07:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:08.382 07:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:08.382 07:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:08.382 07:50:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:08.382 07:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:08.641 [ 00:15:08.641 { 00:15:08.641 "name": "BaseBdev1", 00:15:08.641 "aliases": [ 00:15:08.641 "eb0a8177-938a-4663-acf0-4376472f9d66" 00:15:08.641 ], 00:15:08.641 "product_name": "Malloc disk", 00:15:08.641 "block_size": 512, 00:15:08.641 "num_blocks": 65536, 00:15:08.641 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:08.641 "assigned_rate_limits": { 00:15:08.641 "rw_ios_per_sec": 0, 00:15:08.641 "rw_mbytes_per_sec": 0, 00:15:08.641 "r_mbytes_per_sec": 0, 00:15:08.641 "w_mbytes_per_sec": 0 00:15:08.641 }, 00:15:08.641 "claimed": true, 00:15:08.641 "claim_type": "exclusive_write", 00:15:08.641 "zoned": false, 00:15:08.641 "supported_io_types": { 00:15:08.641 "read": true, 00:15:08.641 "write": true, 00:15:08.641 "unmap": true, 00:15:08.641 "flush": true, 00:15:08.641 "reset": true, 00:15:08.641 "nvme_admin": false, 00:15:08.641 "nvme_io": false, 00:15:08.641 "nvme_io_md": false, 00:15:08.641 "write_zeroes": true, 00:15:08.641 "zcopy": true, 00:15:08.641 "get_zone_info": false, 00:15:08.641 "zone_management": false, 00:15:08.641 "zone_append": false, 00:15:08.641 "compare": false, 00:15:08.641 "compare_and_write": false, 00:15:08.641 "abort": true, 00:15:08.641 "seek_hole": false, 00:15:08.641 "seek_data": false, 00:15:08.641 "copy": true, 00:15:08.641 "nvme_iov_md": false 00:15:08.641 }, 00:15:08.641 "memory_domains": [ 00:15:08.641 { 00:15:08.641 "dma_device_id": "system", 00:15:08.641 "dma_device_type": 1 00:15:08.641 }, 00:15:08.641 { 00:15:08.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.641 "dma_device_type": 2 00:15:08.641 } 00:15:08.641 ], 00:15:08.641 "driver_specific": {} 00:15:08.641 } 00:15:08.641 ] 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.641 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:08.899 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.899 "name": "Existed_Raid", 00:15:08.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.899 "strip_size_kb": 0, 00:15:08.899 "state": "configuring", 00:15:08.899 "raid_level": "raid1", 00:15:08.899 "superblock": false, 00:15:08.899 "num_base_bdevs": 3, 00:15:08.899 "num_base_bdevs_discovered": 1, 00:15:08.899 "num_base_bdevs_operational": 3, 00:15:08.899 "base_bdevs_list": [ 00:15:08.899 { 00:15:08.899 "name": "BaseBdev1", 00:15:08.899 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:08.899 "is_configured": true, 00:15:08.899 "data_offset": 0, 00:15:08.899 "data_size": 65536 00:15:08.899 }, 00:15:08.899 { 00:15:08.899 "name": "BaseBdev2", 00:15:08.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.899 "is_configured": false, 00:15:08.899 "data_offset": 0, 00:15:08.899 "data_size": 0 00:15:08.899 }, 00:15:08.899 { 00:15:08.899 "name": "BaseBdev3", 00:15:08.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.899 "is_configured": false, 00:15:08.899 "data_offset": 0, 00:15:08.899 "data_size": 0 00:15:08.899 } 00:15:08.899 ] 00:15:08.899 }' 00:15:08.899 07:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.899 07:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.468 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:09.727 [2024-07-15 07:50:54.276096] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:09.727 [2024-07-15 07:50:54.276121] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b37fa0 name Existed_Raid, state configuring 00:15:09.727 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:09.727 [2024-07-15 07:50:54.472614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:09.727 [2024-07-15 07:50:54.473737] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:09.727 [2024-07-15 07:50:54.473761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:09.727 [2024-07-15 07:50:54.473771] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:09.727 [2024-07-15 07:50:54.473777] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.985 "name": "Existed_Raid", 00:15:09.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.985 "strip_size_kb": 0, 00:15:09.985 "state": "configuring", 00:15:09.985 "raid_level": "raid1", 00:15:09.985 "superblock": false, 00:15:09.985 "num_base_bdevs": 3, 00:15:09.985 "num_base_bdevs_discovered": 1, 00:15:09.985 "num_base_bdevs_operational": 3, 00:15:09.985 "base_bdevs_list": [ 00:15:09.985 { 00:15:09.985 "name": "BaseBdev1", 00:15:09.985 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:09.985 "is_configured": true, 00:15:09.985 "data_offset": 0, 00:15:09.985 "data_size": 65536 00:15:09.985 }, 00:15:09.985 { 00:15:09.985 "name": "BaseBdev2", 00:15:09.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.985 "is_configured": false, 00:15:09.985 "data_offset": 0, 00:15:09.985 "data_size": 0 00:15:09.985 }, 00:15:09.985 { 00:15:09.985 "name": "BaseBdev3", 00:15:09.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.985 "is_configured": false, 00:15:09.985 "data_offset": 0, 00:15:09.985 "data_size": 0 00:15:09.985 } 00:15:09.985 ] 00:15:09.985 }' 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.985 07:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.552 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:10.812 [2024-07-15 07:50:55.395664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:10.812 BaseBdev2 00:15:10.812 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:10.812 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:10.812 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:10.812 07:50:55 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:10.812 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:10.812 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:10.812 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:11.073 [ 00:15:11.073 { 00:15:11.073 "name": "BaseBdev2", 00:15:11.073 "aliases": [ 00:15:11.073 "d8b3e1a2-94d9-4459-a23b-b350149fff64" 00:15:11.073 ], 00:15:11.073 "product_name": "Malloc disk", 00:15:11.073 "block_size": 512, 00:15:11.073 "num_blocks": 65536, 00:15:11.073 "uuid": "d8b3e1a2-94d9-4459-a23b-b350149fff64", 00:15:11.073 "assigned_rate_limits": { 00:15:11.073 "rw_ios_per_sec": 0, 00:15:11.073 "rw_mbytes_per_sec": 0, 00:15:11.073 "r_mbytes_per_sec": 0, 00:15:11.073 "w_mbytes_per_sec": 0 00:15:11.073 }, 00:15:11.073 "claimed": true, 00:15:11.073 "claim_type": "exclusive_write", 00:15:11.073 "zoned": false, 00:15:11.073 "supported_io_types": { 00:15:11.073 "read": true, 00:15:11.073 "write": true, 00:15:11.073 "unmap": true, 00:15:11.073 "flush": true, 00:15:11.073 "reset": true, 00:15:11.073 "nvme_admin": false, 00:15:11.073 "nvme_io": false, 00:15:11.073 "nvme_io_md": false, 00:15:11.073 "write_zeroes": true, 00:15:11.073 "zcopy": true, 00:15:11.073 "get_zone_info": false, 00:15:11.073 "zone_management": false, 00:15:11.073 "zone_append": false, 00:15:11.073 "compare": false, 00:15:11.073 "compare_and_write": false, 00:15:11.073 "abort": true, 00:15:11.073 "seek_hole": false, 00:15:11.073 "seek_data": false, 00:15:11.073 "copy": true, 00:15:11.073 "nvme_iov_md": false 00:15:11.073 }, 00:15:11.073 "memory_domains": [ 00:15:11.073 { 00:15:11.073 "dma_device_id": "system", 00:15:11.073 "dma_device_type": 1 00:15:11.073 }, 00:15:11.073 { 00:15:11.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.073 "dma_device_type": 2 00:15:11.073 } 00:15:11.073 ], 00:15:11.073 "driver_specific": {} 00:15:11.073 } 00:15:11.073 ] 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.073 
07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.073 07:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.332 07:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.332 "name": "Existed_Raid", 00:15:11.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.332 "strip_size_kb": 0, 00:15:11.332 "state": "configuring", 00:15:11.332 "raid_level": "raid1", 00:15:11.332 "superblock": false, 00:15:11.332 "num_base_bdevs": 3, 00:15:11.332 "num_base_bdevs_discovered": 2, 00:15:11.332 "num_base_bdevs_operational": 3, 00:15:11.332 "base_bdevs_list": [ 00:15:11.332 { 00:15:11.332 "name": "BaseBdev1", 00:15:11.332 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:11.332 "is_configured": true, 00:15:11.332 "data_offset": 0, 00:15:11.332 "data_size": 65536 00:15:11.332 }, 00:15:11.332 { 00:15:11.332 "name": "BaseBdev2", 00:15:11.332 "uuid": "d8b3e1a2-94d9-4459-a23b-b350149fff64", 00:15:11.332 "is_configured": true, 00:15:11.332 "data_offset": 0, 00:15:11.332 "data_size": 65536 00:15:11.332 }, 00:15:11.332 { 00:15:11.332 "name": "BaseBdev3", 00:15:11.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.332 "is_configured": false, 00:15:11.332 "data_offset": 0, 00:15:11.332 "data_size": 0 00:15:11.332 } 00:15:11.332 ] 00:15:11.332 }' 00:15:11.332 07:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.332 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.901 07:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:12.160 [2024-07-15 07:50:56.683660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:12.160 [2024-07-15 07:50:56.683686] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b38e90 00:15:12.160 [2024-07-15 07:50:56.683690] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:12.160 [2024-07-15 07:50:56.683844] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b38b60 00:15:12.160 [2024-07-15 07:50:56.683940] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b38e90 00:15:12.160 [2024-07-15 07:50:56.683946] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b38e90 00:15:12.160 [2024-07-15 07:50:56.684061] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.160 BaseBdev3 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:12.160 07:50:56 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.160 07:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:12.419 [ 00:15:12.419 { 00:15:12.419 "name": "BaseBdev3", 00:15:12.419 "aliases": [ 00:15:12.419 "9d89d4b5-346a-4231-941e-1bac0c9c12b3" 00:15:12.419 ], 00:15:12.419 "product_name": "Malloc disk", 00:15:12.419 "block_size": 512, 00:15:12.419 "num_blocks": 65536, 00:15:12.419 "uuid": "9d89d4b5-346a-4231-941e-1bac0c9c12b3", 00:15:12.419 "assigned_rate_limits": { 00:15:12.419 "rw_ios_per_sec": 0, 00:15:12.419 "rw_mbytes_per_sec": 0, 00:15:12.419 "r_mbytes_per_sec": 0, 00:15:12.419 "w_mbytes_per_sec": 0 00:15:12.419 }, 00:15:12.419 "claimed": true, 00:15:12.419 "claim_type": "exclusive_write", 00:15:12.419 "zoned": false, 00:15:12.419 "supported_io_types": { 00:15:12.419 "read": true, 00:15:12.419 "write": true, 00:15:12.419 "unmap": true, 00:15:12.419 "flush": true, 00:15:12.419 "reset": true, 00:15:12.419 "nvme_admin": false, 00:15:12.419 "nvme_io": false, 00:15:12.419 "nvme_io_md": false, 00:15:12.419 "write_zeroes": true, 00:15:12.419 "zcopy": true, 00:15:12.419 "get_zone_info": false, 00:15:12.419 "zone_management": false, 00:15:12.419 "zone_append": false, 00:15:12.419 "compare": false, 00:15:12.419 "compare_and_write": false, 00:15:12.419 "abort": true, 00:15:12.419 "seek_hole": false, 00:15:12.419 "seek_data": false, 00:15:12.419 "copy": true, 00:15:12.419 "nvme_iov_md": false 00:15:12.419 }, 00:15:12.419 "memory_domains": [ 00:15:12.419 { 00:15:12.419 "dma_device_id": "system", 00:15:12.419 "dma_device_type": 1 00:15:12.419 }, 00:15:12.419 { 00:15:12.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.419 "dma_device_type": 2 00:15:12.419 } 00:15:12.419 ], 00:15:12.419 "driver_specific": {} 00:15:12.419 } 00:15:12.419 ] 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.419 07:50:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.419 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.678 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.678 "name": "Existed_Raid", 00:15:12.678 "uuid": "a5f1c505-cd6a-4462-a6d7-e5c2be9a1807", 00:15:12.678 "strip_size_kb": 0, 00:15:12.678 "state": "online", 00:15:12.678 "raid_level": "raid1", 00:15:12.678 "superblock": false, 00:15:12.678 "num_base_bdevs": 3, 00:15:12.678 "num_base_bdevs_discovered": 3, 00:15:12.678 "num_base_bdevs_operational": 3, 00:15:12.678 "base_bdevs_list": [ 00:15:12.678 { 00:15:12.678 "name": "BaseBdev1", 00:15:12.678 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:12.678 "is_configured": true, 00:15:12.678 "data_offset": 0, 00:15:12.678 "data_size": 65536 00:15:12.678 }, 00:15:12.678 { 00:15:12.678 "name": "BaseBdev2", 00:15:12.678 "uuid": "d8b3e1a2-94d9-4459-a23b-b350149fff64", 00:15:12.678 "is_configured": true, 00:15:12.678 "data_offset": 0, 00:15:12.678 "data_size": 65536 00:15:12.678 }, 00:15:12.678 { 00:15:12.678 "name": "BaseBdev3", 00:15:12.678 "uuid": "9d89d4b5-346a-4231-941e-1bac0c9c12b3", 00:15:12.678 "is_configured": true, 00:15:12.678 "data_offset": 0, 00:15:12.678 "data_size": 65536 00:15:12.678 } 00:15:12.678 ] 00:15:12.678 }' 00:15:12.678 07:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.678 07:50:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:13.617 [2024-07-15 07:50:58.352116] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:13.617 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:13.617 "name": "Existed_Raid", 00:15:13.617 "aliases": [ 00:15:13.617 "a5f1c505-cd6a-4462-a6d7-e5c2be9a1807" 00:15:13.617 ], 00:15:13.617 "product_name": "Raid Volume", 00:15:13.617 "block_size": 512, 00:15:13.617 "num_blocks": 65536, 00:15:13.617 "uuid": "a5f1c505-cd6a-4462-a6d7-e5c2be9a1807", 
00:15:13.617 "assigned_rate_limits": { 00:15:13.617 "rw_ios_per_sec": 0, 00:15:13.617 "rw_mbytes_per_sec": 0, 00:15:13.617 "r_mbytes_per_sec": 0, 00:15:13.617 "w_mbytes_per_sec": 0 00:15:13.617 }, 00:15:13.617 "claimed": false, 00:15:13.617 "zoned": false, 00:15:13.617 "supported_io_types": { 00:15:13.617 "read": true, 00:15:13.617 "write": true, 00:15:13.617 "unmap": false, 00:15:13.617 "flush": false, 00:15:13.617 "reset": true, 00:15:13.617 "nvme_admin": false, 00:15:13.617 "nvme_io": false, 00:15:13.617 "nvme_io_md": false, 00:15:13.617 "write_zeroes": true, 00:15:13.617 "zcopy": false, 00:15:13.617 "get_zone_info": false, 00:15:13.617 "zone_management": false, 00:15:13.617 "zone_append": false, 00:15:13.617 "compare": false, 00:15:13.617 "compare_and_write": false, 00:15:13.617 "abort": false, 00:15:13.617 "seek_hole": false, 00:15:13.617 "seek_data": false, 00:15:13.617 "copy": false, 00:15:13.617 "nvme_iov_md": false 00:15:13.617 }, 00:15:13.617 "memory_domains": [ 00:15:13.617 { 00:15:13.617 "dma_device_id": "system", 00:15:13.617 "dma_device_type": 1 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.617 "dma_device_type": 2 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "dma_device_id": "system", 00:15:13.617 "dma_device_type": 1 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.617 "dma_device_type": 2 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "dma_device_id": "system", 00:15:13.617 "dma_device_type": 1 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.617 "dma_device_type": 2 00:15:13.617 } 00:15:13.617 ], 00:15:13.617 "driver_specific": { 00:15:13.617 "raid": { 00:15:13.617 "uuid": "a5f1c505-cd6a-4462-a6d7-e5c2be9a1807", 00:15:13.617 "strip_size_kb": 0, 00:15:13.617 "state": "online", 00:15:13.617 "raid_level": "raid1", 00:15:13.617 "superblock": false, 00:15:13.617 "num_base_bdevs": 3, 00:15:13.617 "num_base_bdevs_discovered": 3, 00:15:13.617 "num_base_bdevs_operational": 3, 00:15:13.617 "base_bdevs_list": [ 00:15:13.617 { 00:15:13.617 "name": "BaseBdev1", 00:15:13.617 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:13.617 "is_configured": true, 00:15:13.617 "data_offset": 0, 00:15:13.617 "data_size": 65536 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "name": "BaseBdev2", 00:15:13.617 "uuid": "d8b3e1a2-94d9-4459-a23b-b350149fff64", 00:15:13.617 "is_configured": true, 00:15:13.617 "data_offset": 0, 00:15:13.617 "data_size": 65536 00:15:13.617 }, 00:15:13.617 { 00:15:13.617 "name": "BaseBdev3", 00:15:13.617 "uuid": "9d89d4b5-346a-4231-941e-1bac0c9c12b3", 00:15:13.617 "is_configured": true, 00:15:13.617 "data_offset": 0, 00:15:13.617 "data_size": 65536 00:15:13.617 } 00:15:13.617 ] 00:15:13.617 } 00:15:13.617 } 00:15:13.617 }' 00:15:13.877 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:13.877 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:13.877 BaseBdev2 00:15:13.877 BaseBdev3' 00:15:13.877 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:13.877 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:13.877 07:50:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.136 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.136 "name": "BaseBdev1", 00:15:14.136 "aliases": [ 00:15:14.136 "eb0a8177-938a-4663-acf0-4376472f9d66" 00:15:14.136 ], 00:15:14.136 "product_name": "Malloc disk", 00:15:14.136 "block_size": 512, 00:15:14.137 "num_blocks": 65536, 00:15:14.137 "uuid": "eb0a8177-938a-4663-acf0-4376472f9d66", 00:15:14.137 "assigned_rate_limits": { 00:15:14.137 "rw_ios_per_sec": 0, 00:15:14.137 "rw_mbytes_per_sec": 0, 00:15:14.137 "r_mbytes_per_sec": 0, 00:15:14.137 "w_mbytes_per_sec": 0 00:15:14.137 }, 00:15:14.137 "claimed": true, 00:15:14.137 "claim_type": "exclusive_write", 00:15:14.137 "zoned": false, 00:15:14.137 "supported_io_types": { 00:15:14.137 "read": true, 00:15:14.137 "write": true, 00:15:14.137 "unmap": true, 00:15:14.137 "flush": true, 00:15:14.137 "reset": true, 00:15:14.137 "nvme_admin": false, 00:15:14.137 "nvme_io": false, 00:15:14.137 "nvme_io_md": false, 00:15:14.137 "write_zeroes": true, 00:15:14.137 "zcopy": true, 00:15:14.137 "get_zone_info": false, 00:15:14.137 "zone_management": false, 00:15:14.137 "zone_append": false, 00:15:14.137 "compare": false, 00:15:14.137 "compare_and_write": false, 00:15:14.137 "abort": true, 00:15:14.137 "seek_hole": false, 00:15:14.137 "seek_data": false, 00:15:14.137 "copy": true, 00:15:14.137 "nvme_iov_md": false 00:15:14.137 }, 00:15:14.137 "memory_domains": [ 00:15:14.137 { 00:15:14.137 "dma_device_id": "system", 00:15:14.137 "dma_device_type": 1 00:15:14.137 }, 00:15:14.137 { 00:15:14.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.137 "dma_device_type": 2 00:15:14.137 } 00:15:14.137 ], 00:15:14.137 "driver_specific": {} 00:15:14.137 }' 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:14.137 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.396 07:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:14.396 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.656 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.656 "name": "BaseBdev2", 
00:15:14.656 "aliases": [ 00:15:14.656 "d8b3e1a2-94d9-4459-a23b-b350149fff64" 00:15:14.656 ], 00:15:14.656 "product_name": "Malloc disk", 00:15:14.656 "block_size": 512, 00:15:14.656 "num_blocks": 65536, 00:15:14.656 "uuid": "d8b3e1a2-94d9-4459-a23b-b350149fff64", 00:15:14.656 "assigned_rate_limits": { 00:15:14.656 "rw_ios_per_sec": 0, 00:15:14.656 "rw_mbytes_per_sec": 0, 00:15:14.656 "r_mbytes_per_sec": 0, 00:15:14.656 "w_mbytes_per_sec": 0 00:15:14.656 }, 00:15:14.656 "claimed": true, 00:15:14.656 "claim_type": "exclusive_write", 00:15:14.656 "zoned": false, 00:15:14.656 "supported_io_types": { 00:15:14.656 "read": true, 00:15:14.656 "write": true, 00:15:14.656 "unmap": true, 00:15:14.656 "flush": true, 00:15:14.656 "reset": true, 00:15:14.656 "nvme_admin": false, 00:15:14.656 "nvme_io": false, 00:15:14.656 "nvme_io_md": false, 00:15:14.656 "write_zeroes": true, 00:15:14.656 "zcopy": true, 00:15:14.656 "get_zone_info": false, 00:15:14.656 "zone_management": false, 00:15:14.656 "zone_append": false, 00:15:14.656 "compare": false, 00:15:14.656 "compare_and_write": false, 00:15:14.656 "abort": true, 00:15:14.656 "seek_hole": false, 00:15:14.656 "seek_data": false, 00:15:14.656 "copy": true, 00:15:14.656 "nvme_iov_md": false 00:15:14.656 }, 00:15:14.656 "memory_domains": [ 00:15:14.656 { 00:15:14.656 "dma_device_id": "system", 00:15:14.656 "dma_device_type": 1 00:15:14.656 }, 00:15:14.656 { 00:15:14.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.656 "dma_device_type": 2 00:15:14.656 } 00:15:14.656 ], 00:15:14.656 "driver_specific": {} 00:15:14.656 }' 00:15:14.656 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.656 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.917 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:15.177 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.177 "name": "BaseBdev3", 00:15:15.177 "aliases": [ 00:15:15.177 "9d89d4b5-346a-4231-941e-1bac0c9c12b3" 00:15:15.177 ], 00:15:15.177 "product_name": "Malloc disk", 00:15:15.177 "block_size": 512, 
00:15:15.177 "num_blocks": 65536, 00:15:15.177 "uuid": "9d89d4b5-346a-4231-941e-1bac0c9c12b3", 00:15:15.177 "assigned_rate_limits": { 00:15:15.177 "rw_ios_per_sec": 0, 00:15:15.177 "rw_mbytes_per_sec": 0, 00:15:15.177 "r_mbytes_per_sec": 0, 00:15:15.177 "w_mbytes_per_sec": 0 00:15:15.177 }, 00:15:15.177 "claimed": true, 00:15:15.177 "claim_type": "exclusive_write", 00:15:15.177 "zoned": false, 00:15:15.177 "supported_io_types": { 00:15:15.177 "read": true, 00:15:15.177 "write": true, 00:15:15.177 "unmap": true, 00:15:15.177 "flush": true, 00:15:15.177 "reset": true, 00:15:15.177 "nvme_admin": false, 00:15:15.177 "nvme_io": false, 00:15:15.177 "nvme_io_md": false, 00:15:15.177 "write_zeroes": true, 00:15:15.177 "zcopy": true, 00:15:15.177 "get_zone_info": false, 00:15:15.177 "zone_management": false, 00:15:15.177 "zone_append": false, 00:15:15.177 "compare": false, 00:15:15.177 "compare_and_write": false, 00:15:15.177 "abort": true, 00:15:15.177 "seek_hole": false, 00:15:15.177 "seek_data": false, 00:15:15.177 "copy": true, 00:15:15.177 "nvme_iov_md": false 00:15:15.177 }, 00:15:15.177 "memory_domains": [ 00:15:15.177 { 00:15:15.177 "dma_device_id": "system", 00:15:15.177 "dma_device_type": 1 00:15:15.177 }, 00:15:15.177 { 00:15:15.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.177 "dma_device_type": 2 00:15:15.177 } 00:15:15.177 ], 00:15:15.177 "driver_specific": {} 00:15:15.177 }' 00:15:15.177 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.177 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.437 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.437 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.437 07:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.437 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.437 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.437 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.437 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.437 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.437 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:15.697 [2024-07-15 07:51:00.393106] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.697 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.956 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.956 "name": "Existed_Raid", 00:15:15.956 "uuid": "a5f1c505-cd6a-4462-a6d7-e5c2be9a1807", 00:15:15.956 "strip_size_kb": 0, 00:15:15.956 "state": "online", 00:15:15.956 "raid_level": "raid1", 00:15:15.956 "superblock": false, 00:15:15.956 "num_base_bdevs": 3, 00:15:15.957 "num_base_bdevs_discovered": 2, 00:15:15.957 "num_base_bdevs_operational": 2, 00:15:15.957 "base_bdevs_list": [ 00:15:15.957 { 00:15:15.957 "name": null, 00:15:15.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.957 "is_configured": false, 00:15:15.957 "data_offset": 0, 00:15:15.957 "data_size": 65536 00:15:15.957 }, 00:15:15.957 { 00:15:15.957 "name": "BaseBdev2", 00:15:15.957 "uuid": "d8b3e1a2-94d9-4459-a23b-b350149fff64", 00:15:15.957 "is_configured": true, 00:15:15.957 "data_offset": 0, 00:15:15.957 "data_size": 65536 00:15:15.957 }, 00:15:15.957 { 00:15:15.957 "name": "BaseBdev3", 00:15:15.957 "uuid": "9d89d4b5-346a-4231-941e-1bac0c9c12b3", 00:15:15.957 "is_configured": true, 00:15:15.957 "data_offset": 0, 00:15:15.957 "data_size": 65536 00:15:15.957 } 00:15:15.957 ] 00:15:15.957 }' 00:15:15.957 07:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.957 07:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.525 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:16.525 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:16.525 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.525 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:16.785 [2024-07-15 07:51:01.511936] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.785 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:17.058 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:17.058 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:17.058 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:17.368 [2024-07-15 07:51:01.894741] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:17.368 [2024-07-15 07:51:01.894793] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:17.368 [2024-07-15 07:51:01.900691] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:17.368 [2024-07-15 07:51:01.900720] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:17.368 [2024-07-15 07:51:01.900726] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b38e90 name Existed_Raid, state offline 00:15:17.368 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:17.368 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:17.368 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.368 07:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:17.368 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:17.368 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:17.368 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:17.368 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:17.368 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:17.368 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:17.629 BaseBdev2 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:17.629 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:17.888 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:18.147 [ 00:15:18.147 { 00:15:18.147 "name": "BaseBdev2", 00:15:18.147 "aliases": [ 00:15:18.147 "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b" 00:15:18.147 ], 00:15:18.148 "product_name": "Malloc disk", 00:15:18.148 "block_size": 512, 00:15:18.148 "num_blocks": 65536, 00:15:18.148 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:18.148 "assigned_rate_limits": { 00:15:18.148 "rw_ios_per_sec": 0, 00:15:18.148 "rw_mbytes_per_sec": 0, 00:15:18.148 "r_mbytes_per_sec": 0, 00:15:18.148 "w_mbytes_per_sec": 0 00:15:18.148 }, 00:15:18.148 "claimed": false, 00:15:18.148 "zoned": false, 00:15:18.148 "supported_io_types": { 00:15:18.148 "read": true, 00:15:18.148 "write": true, 00:15:18.148 "unmap": true, 00:15:18.148 "flush": true, 00:15:18.148 "reset": true, 00:15:18.148 "nvme_admin": false, 00:15:18.148 "nvme_io": false, 00:15:18.148 "nvme_io_md": false, 00:15:18.148 "write_zeroes": true, 00:15:18.148 "zcopy": true, 00:15:18.148 "get_zone_info": false, 00:15:18.148 "zone_management": false, 00:15:18.148 "zone_append": false, 00:15:18.148 "compare": false, 00:15:18.148 "compare_and_write": false, 00:15:18.148 "abort": true, 00:15:18.148 "seek_hole": false, 00:15:18.148 "seek_data": false, 00:15:18.148 "copy": true, 00:15:18.148 "nvme_iov_md": false 00:15:18.148 }, 00:15:18.148 "memory_domains": [ 00:15:18.148 { 00:15:18.148 "dma_device_id": "system", 00:15:18.148 "dma_device_type": 1 00:15:18.148 }, 00:15:18.148 { 00:15:18.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.148 "dma_device_type": 2 00:15:18.148 } 00:15:18.148 ], 00:15:18.148 "driver_specific": {} 00:15:18.148 } 00:15:18.148 ] 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:18.148 BaseBdev3 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:18.148 07:51:02 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:18.148 07:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.408 07:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:18.668 [ 00:15:18.668 { 00:15:18.668 "name": "BaseBdev3", 00:15:18.668 "aliases": [ 00:15:18.668 "535a784a-8300-4614-b996-5b2c8a03bfd8" 00:15:18.668 ], 00:15:18.668 "product_name": "Malloc disk", 00:15:18.668 "block_size": 512, 00:15:18.668 "num_blocks": 65536, 00:15:18.668 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:18.668 "assigned_rate_limits": { 00:15:18.668 "rw_ios_per_sec": 0, 00:15:18.668 "rw_mbytes_per_sec": 0, 00:15:18.668 "r_mbytes_per_sec": 0, 00:15:18.668 "w_mbytes_per_sec": 0 00:15:18.668 }, 00:15:18.668 "claimed": false, 00:15:18.668 "zoned": false, 00:15:18.668 "supported_io_types": { 00:15:18.668 "read": true, 00:15:18.668 "write": true, 00:15:18.668 "unmap": true, 00:15:18.668 "flush": true, 00:15:18.668 "reset": true, 00:15:18.668 "nvme_admin": false, 00:15:18.668 "nvme_io": false, 00:15:18.668 "nvme_io_md": false, 00:15:18.668 "write_zeroes": true, 00:15:18.668 "zcopy": true, 00:15:18.668 "get_zone_info": false, 00:15:18.668 "zone_management": false, 00:15:18.668 "zone_append": false, 00:15:18.668 "compare": false, 00:15:18.668 "compare_and_write": false, 00:15:18.668 "abort": true, 00:15:18.668 "seek_hole": false, 00:15:18.668 "seek_data": false, 00:15:18.668 "copy": true, 00:15:18.668 "nvme_iov_md": false 00:15:18.668 }, 00:15:18.668 "memory_domains": [ 00:15:18.668 { 00:15:18.668 "dma_device_id": "system", 00:15:18.668 "dma_device_type": 1 00:15:18.668 }, 00:15:18.668 { 00:15:18.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.668 "dma_device_type": 2 00:15:18.668 } 00:15:18.668 ], 00:15:18.668 "driver_specific": {} 00:15:18.668 } 00:15:18.668 ] 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:18.668 [2024-07-15 07:51:03.390363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:18.668 [2024-07-15 07:51:03.390390] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:18.668 [2024-07-15 07:51:03.390402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:18.668 [2024-07-15 07:51:03.391433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.668 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.928 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.928 "name": "Existed_Raid", 00:15:18.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.928 "strip_size_kb": 0, 00:15:18.928 "state": "configuring", 00:15:18.928 "raid_level": "raid1", 00:15:18.928 "superblock": false, 00:15:18.928 "num_base_bdevs": 3, 00:15:18.928 "num_base_bdevs_discovered": 2, 00:15:18.928 "num_base_bdevs_operational": 3, 00:15:18.928 "base_bdevs_list": [ 00:15:18.928 { 00:15:18.928 "name": "BaseBdev1", 00:15:18.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.928 "is_configured": false, 00:15:18.928 "data_offset": 0, 00:15:18.928 "data_size": 0 00:15:18.928 }, 00:15:18.928 { 00:15:18.928 "name": "BaseBdev2", 00:15:18.928 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:18.928 "is_configured": true, 00:15:18.928 "data_offset": 0, 00:15:18.928 "data_size": 65536 00:15:18.928 }, 00:15:18.928 { 00:15:18.928 "name": "BaseBdev3", 00:15:18.928 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:18.928 "is_configured": true, 00:15:18.928 "data_offset": 0, 00:15:18.928 "data_size": 65536 00:15:18.928 } 00:15:18.928 ] 00:15:18.928 }' 00:15:18.928 07:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.928 07:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:19.496 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:19.757 [2024-07-15 07:51:04.300656] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.757 "name": "Existed_Raid", 00:15:19.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.757 "strip_size_kb": 0, 00:15:19.757 "state": "configuring", 00:15:19.757 "raid_level": "raid1", 00:15:19.757 "superblock": false, 00:15:19.757 "num_base_bdevs": 3, 00:15:19.757 "num_base_bdevs_discovered": 1, 00:15:19.757 "num_base_bdevs_operational": 3, 00:15:19.757 "base_bdevs_list": [ 00:15:19.757 { 00:15:19.757 "name": "BaseBdev1", 00:15:19.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.757 "is_configured": false, 00:15:19.757 "data_offset": 0, 00:15:19.757 "data_size": 0 00:15:19.757 }, 00:15:19.757 { 00:15:19.757 "name": null, 00:15:19.757 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:19.757 "is_configured": false, 00:15:19.757 "data_offset": 0, 00:15:19.757 "data_size": 65536 00:15:19.757 }, 00:15:19.757 { 00:15:19.757 "name": "BaseBdev3", 00:15:19.757 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:19.757 "is_configured": true, 00:15:19.757 "data_offset": 0, 00:15:19.757 "data_size": 65536 00:15:19.757 } 00:15:19.757 ] 00:15:19.757 }' 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.757 07:51:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.327 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.327 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:20.587 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:20.587 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:20.847 [2024-07-15 07:51:05.420615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:20.847 BaseBdev1 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.847 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:21.107 [ 00:15:21.107 { 00:15:21.107 "name": "BaseBdev1", 00:15:21.107 "aliases": [ 00:15:21.107 "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70" 00:15:21.107 ], 00:15:21.107 "product_name": "Malloc disk", 00:15:21.107 "block_size": 512, 00:15:21.107 "num_blocks": 65536, 00:15:21.107 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:21.107 "assigned_rate_limits": { 00:15:21.107 "rw_ios_per_sec": 0, 00:15:21.107 "rw_mbytes_per_sec": 0, 00:15:21.107 "r_mbytes_per_sec": 0, 00:15:21.107 "w_mbytes_per_sec": 0 00:15:21.107 }, 00:15:21.107 "claimed": true, 00:15:21.107 "claim_type": "exclusive_write", 00:15:21.107 "zoned": false, 00:15:21.107 "supported_io_types": { 00:15:21.107 "read": true, 00:15:21.107 "write": true, 00:15:21.107 "unmap": true, 00:15:21.107 "flush": true, 00:15:21.107 "reset": true, 00:15:21.107 "nvme_admin": false, 00:15:21.107 "nvme_io": false, 00:15:21.107 "nvme_io_md": false, 00:15:21.107 "write_zeroes": true, 00:15:21.107 "zcopy": true, 00:15:21.107 "get_zone_info": false, 00:15:21.107 "zone_management": false, 00:15:21.107 "zone_append": false, 00:15:21.107 "compare": false, 00:15:21.107 "compare_and_write": false, 00:15:21.107 "abort": true, 00:15:21.107 "seek_hole": false, 00:15:21.107 "seek_data": false, 00:15:21.107 "copy": true, 00:15:21.107 "nvme_iov_md": false 00:15:21.107 }, 00:15:21.107 "memory_domains": [ 00:15:21.107 { 00:15:21.107 "dma_device_id": "system", 00:15:21.107 "dma_device_type": 1 00:15:21.107 }, 00:15:21.107 { 00:15:21.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.107 "dma_device_type": 2 00:15:21.107 } 00:15:21.107 ], 00:15:21.107 "driver_specific": {} 00:15:21.107 } 00:15:21.107 ] 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.107 07:51:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.107 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.367 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.367 "name": "Existed_Raid", 00:15:21.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.367 "strip_size_kb": 0, 00:15:21.367 "state": "configuring", 00:15:21.367 "raid_level": "raid1", 00:15:21.367 "superblock": false, 00:15:21.367 "num_base_bdevs": 3, 00:15:21.367 "num_base_bdevs_discovered": 2, 00:15:21.367 "num_base_bdevs_operational": 3, 00:15:21.367 "base_bdevs_list": [ 00:15:21.367 { 00:15:21.367 "name": "BaseBdev1", 00:15:21.367 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:21.367 "is_configured": true, 00:15:21.367 "data_offset": 0, 00:15:21.367 "data_size": 65536 00:15:21.367 }, 00:15:21.367 { 00:15:21.367 "name": null, 00:15:21.367 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:21.367 "is_configured": false, 00:15:21.367 "data_offset": 0, 00:15:21.367 "data_size": 65536 00:15:21.367 }, 00:15:21.367 { 00:15:21.367 "name": "BaseBdev3", 00:15:21.367 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:21.367 "is_configured": true, 00:15:21.367 "data_offset": 0, 00:15:21.367 "data_size": 65536 00:15:21.367 } 00:15:21.367 ] 00:15:21.367 }' 00:15:21.367 07:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.367 07:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.938 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.938 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:22.198 [2024-07-15 07:51:06.916407] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.198 
07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.198 07:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.458 07:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.458 "name": "Existed_Raid", 00:15:22.458 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.458 "strip_size_kb": 0, 00:15:22.458 "state": "configuring", 00:15:22.458 "raid_level": "raid1", 00:15:22.458 "superblock": false, 00:15:22.458 "num_base_bdevs": 3, 00:15:22.458 "num_base_bdevs_discovered": 1, 00:15:22.458 "num_base_bdevs_operational": 3, 00:15:22.458 "base_bdevs_list": [ 00:15:22.458 { 00:15:22.458 "name": "BaseBdev1", 00:15:22.458 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:22.458 "is_configured": true, 00:15:22.458 "data_offset": 0, 00:15:22.458 "data_size": 65536 00:15:22.458 }, 00:15:22.458 { 00:15:22.458 "name": null, 00:15:22.458 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:22.458 "is_configured": false, 00:15:22.458 "data_offset": 0, 00:15:22.458 "data_size": 65536 00:15:22.458 }, 00:15:22.458 { 00:15:22.458 "name": null, 00:15:22.458 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:22.458 "is_configured": false, 00:15:22.458 "data_offset": 0, 00:15:22.458 "data_size": 65536 00:15:22.458 } 00:15:22.458 ] 00:15:22.458 }' 00:15:22.458 07:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.458 07:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.028 07:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.028 07:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:23.289 07:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:23.289 07:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:23.289 [2024-07-15 07:51:08.007166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.289 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.549 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.549 "name": "Existed_Raid", 00:15:23.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.549 "strip_size_kb": 0, 00:15:23.549 "state": "configuring", 00:15:23.549 "raid_level": "raid1", 00:15:23.549 "superblock": false, 00:15:23.549 "num_base_bdevs": 3, 00:15:23.549 "num_base_bdevs_discovered": 2, 00:15:23.549 "num_base_bdevs_operational": 3, 00:15:23.549 "base_bdevs_list": [ 00:15:23.549 { 00:15:23.549 "name": "BaseBdev1", 00:15:23.549 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:23.549 "is_configured": true, 00:15:23.549 "data_offset": 0, 00:15:23.549 "data_size": 65536 00:15:23.549 }, 00:15:23.549 { 00:15:23.549 "name": null, 00:15:23.549 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:23.549 "is_configured": false, 00:15:23.549 "data_offset": 0, 00:15:23.549 "data_size": 65536 00:15:23.549 }, 00:15:23.549 { 00:15:23.549 "name": "BaseBdev3", 00:15:23.549 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:23.549 "is_configured": true, 00:15:23.549 "data_offset": 0, 00:15:23.549 "data_size": 65536 00:15:23.549 } 00:15:23.549 ] 00:15:23.549 }' 00:15:23.549 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.549 07:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.118 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.118 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:24.377 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:24.377 07:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:24.377 [2024-07-15 07:51:09.117985] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.636 
07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.636 "name": "Existed_Raid", 00:15:24.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.636 "strip_size_kb": 0, 00:15:24.636 "state": "configuring", 00:15:24.636 "raid_level": "raid1", 00:15:24.636 "superblock": false, 00:15:24.636 "num_base_bdevs": 3, 00:15:24.636 "num_base_bdevs_discovered": 1, 00:15:24.636 "num_base_bdevs_operational": 3, 00:15:24.636 "base_bdevs_list": [ 00:15:24.636 { 00:15:24.636 "name": null, 00:15:24.636 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:24.636 "is_configured": false, 00:15:24.636 "data_offset": 0, 00:15:24.636 "data_size": 65536 00:15:24.636 }, 00:15:24.636 { 00:15:24.636 "name": null, 00:15:24.636 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:24.636 "is_configured": false, 00:15:24.636 "data_offset": 0, 00:15:24.636 "data_size": 65536 00:15:24.636 }, 00:15:24.636 { 00:15:24.636 "name": "BaseBdev3", 00:15:24.636 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:24.636 "is_configured": true, 00:15:24.636 "data_offset": 0, 00:15:24.636 "data_size": 65536 00:15:24.636 } 00:15:24.636 ] 00:15:24.636 }' 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.636 07:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.207 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:25.207 07:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.466 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:25.466 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:25.726 [2024-07-15 07:51:10.238607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.726 "name": "Existed_Raid", 00:15:25.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.726 "strip_size_kb": 0, 00:15:25.726 "state": "configuring", 00:15:25.726 "raid_level": "raid1", 00:15:25.726 "superblock": false, 00:15:25.726 "num_base_bdevs": 3, 00:15:25.726 "num_base_bdevs_discovered": 2, 00:15:25.726 "num_base_bdevs_operational": 3, 00:15:25.726 "base_bdevs_list": [ 00:15:25.726 { 00:15:25.726 "name": null, 00:15:25.726 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:25.726 "is_configured": false, 00:15:25.726 "data_offset": 0, 00:15:25.726 "data_size": 65536 00:15:25.726 }, 00:15:25.726 { 00:15:25.726 "name": "BaseBdev2", 00:15:25.726 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:25.726 "is_configured": true, 00:15:25.726 "data_offset": 0, 00:15:25.726 "data_size": 65536 00:15:25.726 }, 00:15:25.726 { 00:15:25.726 "name": "BaseBdev3", 00:15:25.726 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:25.726 "is_configured": true, 00:15:25.726 "data_offset": 0, 00:15:25.726 "data_size": 65536 00:15:25.726 } 00:15:25.726 ] 00:15:25.726 }' 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.726 07:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:26.295 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:26.295 07:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.555 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:26.555 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.555 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70 00:15:26.815 [2024-07-15 07:51:11.554917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:26.815 [2024-07-15 07:51:11.554943] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b3a630 00:15:26.815 [2024-07-15 07:51:11.554948] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:26.815 [2024-07-15 07:51:11.555095] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b386a0 00:15:26.815 [2024-07-15 07:51:11.555188] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b3a630 00:15:26.815 [2024-07-15 07:51:11.555194] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b3a630 00:15:26.815 [2024-07-15 07:51:11.555312] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:26.815 NewBaseBdev 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:26.815 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.076 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:27.337 [ 00:15:27.337 { 00:15:27.337 "name": "NewBaseBdev", 00:15:27.337 "aliases": [ 00:15:27.337 "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70" 00:15:27.337 ], 00:15:27.337 "product_name": "Malloc disk", 00:15:27.337 "block_size": 512, 00:15:27.337 "num_blocks": 65536, 00:15:27.337 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:27.337 "assigned_rate_limits": { 00:15:27.337 "rw_ios_per_sec": 0, 00:15:27.337 "rw_mbytes_per_sec": 0, 00:15:27.337 "r_mbytes_per_sec": 0, 00:15:27.337 "w_mbytes_per_sec": 0 00:15:27.337 }, 00:15:27.337 "claimed": true, 00:15:27.337 "claim_type": "exclusive_write", 00:15:27.337 "zoned": false, 00:15:27.337 "supported_io_types": { 00:15:27.337 "read": true, 00:15:27.337 "write": true, 00:15:27.337 "unmap": true, 00:15:27.337 "flush": true, 00:15:27.337 "reset": true, 00:15:27.337 "nvme_admin": false, 00:15:27.337 "nvme_io": false, 00:15:27.337 "nvme_io_md": false, 00:15:27.337 "write_zeroes": true, 00:15:27.337 "zcopy": true, 00:15:27.337 "get_zone_info": false, 00:15:27.337 "zone_management": false, 00:15:27.338 "zone_append": false, 00:15:27.338 "compare": false, 00:15:27.338 "compare_and_write": false, 00:15:27.338 "abort": true, 00:15:27.338 "seek_hole": false, 00:15:27.338 "seek_data": false, 00:15:27.338 "copy": true, 00:15:27.338 "nvme_iov_md": false 00:15:27.338 }, 00:15:27.338 "memory_domains": [ 00:15:27.338 { 00:15:27.338 "dma_device_id": "system", 00:15:27.338 "dma_device_type": 1 00:15:27.338 }, 00:15:27.338 { 00:15:27.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.338 "dma_device_type": 2 00:15:27.338 } 00:15:27.338 ], 00:15:27.338 "driver_specific": {} 00:15:27.338 } 00:15:27.338 ] 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.338 07:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.599 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.599 "name": "Existed_Raid", 00:15:27.599 "uuid": "50626869-866d-419e-a28b-14b092a783f4", 00:15:27.600 "strip_size_kb": 0, 00:15:27.600 "state": "online", 00:15:27.600 "raid_level": "raid1", 00:15:27.600 "superblock": false, 00:15:27.600 "num_base_bdevs": 3, 00:15:27.600 "num_base_bdevs_discovered": 3, 00:15:27.600 "num_base_bdevs_operational": 3, 00:15:27.600 "base_bdevs_list": [ 00:15:27.600 { 00:15:27.600 "name": "NewBaseBdev", 00:15:27.600 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:27.600 "is_configured": true, 00:15:27.600 "data_offset": 0, 00:15:27.600 "data_size": 65536 00:15:27.600 }, 00:15:27.600 { 00:15:27.600 "name": "BaseBdev2", 00:15:27.600 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:27.600 "is_configured": true, 00:15:27.600 "data_offset": 0, 00:15:27.600 "data_size": 65536 00:15:27.600 }, 00:15:27.600 { 00:15:27.600 "name": "BaseBdev3", 00:15:27.600 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:27.600 "is_configured": true, 00:15:27.600 "data_offset": 0, 00:15:27.600 "data_size": 65536 00:15:27.600 } 00:15:27.600 ] 00:15:27.600 }' 00:15:27.600 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.600 07:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:28.171 07:51:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:28.171 [2024-07-15 07:51:12.806325] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:28.171 "name": "Existed_Raid", 00:15:28.171 "aliases": [ 00:15:28.171 "50626869-866d-419e-a28b-14b092a783f4" 00:15:28.171 ], 00:15:28.171 "product_name": "Raid Volume", 00:15:28.171 "block_size": 512, 00:15:28.171 "num_blocks": 65536, 00:15:28.171 "uuid": "50626869-866d-419e-a28b-14b092a783f4", 00:15:28.171 "assigned_rate_limits": { 00:15:28.171 "rw_ios_per_sec": 0, 00:15:28.171 "rw_mbytes_per_sec": 0, 00:15:28.171 "r_mbytes_per_sec": 0, 00:15:28.171 "w_mbytes_per_sec": 0 00:15:28.171 }, 00:15:28.171 "claimed": false, 00:15:28.171 "zoned": false, 00:15:28.171 "supported_io_types": { 00:15:28.171 "read": true, 00:15:28.171 "write": true, 00:15:28.171 "unmap": false, 00:15:28.171 "flush": false, 00:15:28.171 "reset": true, 00:15:28.171 "nvme_admin": false, 00:15:28.171 "nvme_io": false, 00:15:28.171 "nvme_io_md": false, 00:15:28.171 "write_zeroes": true, 00:15:28.171 "zcopy": false, 00:15:28.171 "get_zone_info": false, 00:15:28.171 "zone_management": false, 00:15:28.171 "zone_append": false, 00:15:28.171 "compare": false, 00:15:28.171 "compare_and_write": false, 00:15:28.171 "abort": false, 00:15:28.171 "seek_hole": false, 00:15:28.171 "seek_data": false, 00:15:28.171 "copy": false, 00:15:28.171 "nvme_iov_md": false 00:15:28.171 }, 00:15:28.171 "memory_domains": [ 00:15:28.171 { 00:15:28.171 "dma_device_id": "system", 00:15:28.171 "dma_device_type": 1 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.171 "dma_device_type": 2 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "dma_device_id": "system", 00:15:28.171 "dma_device_type": 1 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.171 "dma_device_type": 2 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "dma_device_id": "system", 00:15:28.171 "dma_device_type": 1 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.171 "dma_device_type": 2 00:15:28.171 } 00:15:28.171 ], 00:15:28.171 "driver_specific": { 00:15:28.171 "raid": { 00:15:28.171 "uuid": "50626869-866d-419e-a28b-14b092a783f4", 00:15:28.171 "strip_size_kb": 0, 00:15:28.171 "state": "online", 00:15:28.171 "raid_level": "raid1", 00:15:28.171 "superblock": false, 00:15:28.171 "num_base_bdevs": 3, 00:15:28.171 "num_base_bdevs_discovered": 3, 00:15:28.171 "num_base_bdevs_operational": 3, 00:15:28.171 "base_bdevs_list": [ 00:15:28.171 { 00:15:28.171 "name": "NewBaseBdev", 00:15:28.171 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:28.171 "is_configured": true, 00:15:28.171 "data_offset": 0, 00:15:28.171 "data_size": 65536 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "name": "BaseBdev2", 00:15:28.171 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:28.171 "is_configured": true, 00:15:28.171 "data_offset": 0, 00:15:28.171 "data_size": 65536 00:15:28.171 }, 00:15:28.171 { 00:15:28.171 "name": "BaseBdev3", 00:15:28.171 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:28.171 "is_configured": true, 00:15:28.171 "data_offset": 0, 00:15:28.171 "data_size": 
65536 00:15:28.171 } 00:15:28.171 ] 00:15:28.171 } 00:15:28.171 } 00:15:28.171 }' 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:28.171 BaseBdev2 00:15:28.171 BaseBdev3' 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:28.171 07:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:28.431 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:28.431 "name": "NewBaseBdev", 00:15:28.431 "aliases": [ 00:15:28.431 "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70" 00:15:28.431 ], 00:15:28.431 "product_name": "Malloc disk", 00:15:28.431 "block_size": 512, 00:15:28.431 "num_blocks": 65536, 00:15:28.431 "uuid": "2da2fd1e-072a-4d2e-9bb3-fca6c0a6df70", 00:15:28.431 "assigned_rate_limits": { 00:15:28.431 "rw_ios_per_sec": 0, 00:15:28.431 "rw_mbytes_per_sec": 0, 00:15:28.431 "r_mbytes_per_sec": 0, 00:15:28.431 "w_mbytes_per_sec": 0 00:15:28.431 }, 00:15:28.431 "claimed": true, 00:15:28.431 "claim_type": "exclusive_write", 00:15:28.431 "zoned": false, 00:15:28.431 "supported_io_types": { 00:15:28.431 "read": true, 00:15:28.431 "write": true, 00:15:28.431 "unmap": true, 00:15:28.431 "flush": true, 00:15:28.431 "reset": true, 00:15:28.431 "nvme_admin": false, 00:15:28.431 "nvme_io": false, 00:15:28.431 "nvme_io_md": false, 00:15:28.431 "write_zeroes": true, 00:15:28.431 "zcopy": true, 00:15:28.431 "get_zone_info": false, 00:15:28.431 "zone_management": false, 00:15:28.431 "zone_append": false, 00:15:28.431 "compare": false, 00:15:28.431 "compare_and_write": false, 00:15:28.431 "abort": true, 00:15:28.431 "seek_hole": false, 00:15:28.431 "seek_data": false, 00:15:28.431 "copy": true, 00:15:28.431 "nvme_iov_md": false 00:15:28.431 }, 00:15:28.431 "memory_domains": [ 00:15:28.431 { 00:15:28.431 "dma_device_id": "system", 00:15:28.431 "dma_device_type": 1 00:15:28.431 }, 00:15:28.431 { 00:15:28.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.431 "dma_device_type": 2 00:15:28.431 } 00:15:28.431 ], 00:15:28.431 "driver_specific": {} 00:15:28.431 }' 00:15:28.431 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.431 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.431 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:28.431 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:28.701 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:28.964 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:28.964 "name": "BaseBdev2", 00:15:28.964 "aliases": [ 00:15:28.964 "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b" 00:15:28.964 ], 00:15:28.964 "product_name": "Malloc disk", 00:15:28.964 "block_size": 512, 00:15:28.964 "num_blocks": 65536, 00:15:28.964 "uuid": "4f2a9b3c-f393-4f5e-9425-5faa7b26a14b", 00:15:28.964 "assigned_rate_limits": { 00:15:28.964 "rw_ios_per_sec": 0, 00:15:28.964 "rw_mbytes_per_sec": 0, 00:15:28.964 "r_mbytes_per_sec": 0, 00:15:28.964 "w_mbytes_per_sec": 0 00:15:28.964 }, 00:15:28.964 "claimed": true, 00:15:28.964 "claim_type": "exclusive_write", 00:15:28.964 "zoned": false, 00:15:28.964 "supported_io_types": { 00:15:28.964 "read": true, 00:15:28.964 "write": true, 00:15:28.964 "unmap": true, 00:15:28.964 "flush": true, 00:15:28.964 "reset": true, 00:15:28.964 "nvme_admin": false, 00:15:28.964 "nvme_io": false, 00:15:28.964 "nvme_io_md": false, 00:15:28.964 "write_zeroes": true, 00:15:28.964 "zcopy": true, 00:15:28.964 "get_zone_info": false, 00:15:28.964 "zone_management": false, 00:15:28.964 "zone_append": false, 00:15:28.964 "compare": false, 00:15:28.964 "compare_and_write": false, 00:15:28.964 "abort": true, 00:15:28.964 "seek_hole": false, 00:15:28.964 "seek_data": false, 00:15:28.964 "copy": true, 00:15:28.964 "nvme_iov_md": false 00:15:28.964 }, 00:15:28.964 "memory_domains": [ 00:15:28.964 { 00:15:28.964 "dma_device_id": "system", 00:15:28.964 "dma_device_type": 1 00:15:28.964 }, 00:15:28.964 { 00:15:28.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.964 "dma_device_type": 2 00:15:28.964 } 00:15:28.964 ], 00:15:28.964 "driver_specific": {} 00:15:28.964 }' 00:15:28.964 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.964 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:28.964 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:28.964 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:29.223 07:51:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:29.223 07:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:29.483 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:29.483 "name": "BaseBdev3", 00:15:29.483 "aliases": [ 00:15:29.483 "535a784a-8300-4614-b996-5b2c8a03bfd8" 00:15:29.483 ], 00:15:29.483 "product_name": "Malloc disk", 00:15:29.483 "block_size": 512, 00:15:29.483 "num_blocks": 65536, 00:15:29.483 "uuid": "535a784a-8300-4614-b996-5b2c8a03bfd8", 00:15:29.483 "assigned_rate_limits": { 00:15:29.483 "rw_ios_per_sec": 0, 00:15:29.483 "rw_mbytes_per_sec": 0, 00:15:29.483 "r_mbytes_per_sec": 0, 00:15:29.483 "w_mbytes_per_sec": 0 00:15:29.483 }, 00:15:29.483 "claimed": true, 00:15:29.483 "claim_type": "exclusive_write", 00:15:29.483 "zoned": false, 00:15:29.483 "supported_io_types": { 00:15:29.483 "read": true, 00:15:29.483 "write": true, 00:15:29.483 "unmap": true, 00:15:29.483 "flush": true, 00:15:29.483 "reset": true, 00:15:29.483 "nvme_admin": false, 00:15:29.483 "nvme_io": false, 00:15:29.483 "nvme_io_md": false, 00:15:29.483 "write_zeroes": true, 00:15:29.483 "zcopy": true, 00:15:29.483 "get_zone_info": false, 00:15:29.483 "zone_management": false, 00:15:29.483 "zone_append": false, 00:15:29.483 "compare": false, 00:15:29.483 "compare_and_write": false, 00:15:29.483 "abort": true, 00:15:29.483 "seek_hole": false, 00:15:29.483 "seek_data": false, 00:15:29.483 "copy": true, 00:15:29.483 "nvme_iov_md": false 00:15:29.483 }, 00:15:29.483 "memory_domains": [ 00:15:29.483 { 00:15:29.483 "dma_device_id": "system", 00:15:29.483 "dma_device_type": 1 00:15:29.483 }, 00:15:29.483 { 00:15:29.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.483 "dma_device_type": 2 00:15:29.483 } 00:15:29.483 ], 00:15:29.483 "driver_specific": {} 00:15:29.483 }' 00:15:29.483 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:29.483 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:29.743 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:29.744 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:30.004 [2024-07-15 07:51:14.654781] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:30.004 [2024-07-15 07:51:14.654796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:30.004 [2024-07-15 07:51:14.654828] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:30.004 [2024-07-15 07:51:14.655029] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:30.004 [2024-07-15 07:51:14.655036] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b3a630 name Existed_Raid, state offline 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1638697 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1638697 ']' 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1638697 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1638697 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1638697' 00:15:30.004 killing process with pid 1638697 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1638697 00:15:30.004 [2024-07-15 07:51:14.722604] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:30.004 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1638697 00:15:30.004 [2024-07-15 07:51:14.737268] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:30.268 00:15:30.268 real 0m25.038s 00:15:30.268 user 0m46.900s 00:15:30.268 sys 0m3.711s 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.268 ************************************ 00:15:30.268 END TEST raid_state_function_test 00:15:30.268 ************************************ 00:15:30.268 07:51:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:30.268 07:51:14 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:15:30.268 07:51:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:30.268 07:51:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:30.268 07:51:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:30.268 ************************************ 00:15:30.268 START TEST raid_state_function_test_sb 00:15:30.268 ************************************ 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1644125 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1644125' 00:15:30.268 Process raid pid: 1644125 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1644125 /var/tmp/spdk-raid.sock 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 1644125 ']' 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:30.268 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:30.268 07:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:30.268 [2024-07-15 07:51:14.997315] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:15:30.268 [2024-07-15 07:51:14.997361] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:30.545 [2024-07-15 07:51:15.086199] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:30.545 [2024-07-15 07:51:15.152552] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:30.545 [2024-07-15 07:51:15.195279] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:30.545 [2024-07-15 07:51:15.195302] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:31.113 07:51:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:31.113 07:51:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:31.113 07:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:31.373 [2024-07-15 07:51:16.014631] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:31.373 [2024-07-15 07:51:16.014660] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:31.373 [2024-07-15 07:51:16.014666] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:31.373 [2024-07-15 07:51:16.014672] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:31.373 [2024-07-15 07:51:16.014677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:31.373 [2024-07-15 07:51:16.014682] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.373 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.633 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.633 "name": "Existed_Raid", 00:15:31.633 "uuid": "139c597a-1ca8-49c1-9b5e-3983189f04f2", 00:15:31.633 "strip_size_kb": 0, 00:15:31.633 "state": "configuring", 00:15:31.633 "raid_level": "raid1", 00:15:31.633 "superblock": true, 00:15:31.633 "num_base_bdevs": 3, 00:15:31.633 "num_base_bdevs_discovered": 0, 00:15:31.633 "num_base_bdevs_operational": 3, 00:15:31.633 "base_bdevs_list": [ 00:15:31.633 { 00:15:31.633 "name": "BaseBdev1", 00:15:31.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.633 "is_configured": false, 00:15:31.633 "data_offset": 0, 00:15:31.633 "data_size": 0 00:15:31.633 }, 00:15:31.633 { 00:15:31.633 "name": "BaseBdev2", 00:15:31.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.633 "is_configured": false, 00:15:31.633 "data_offset": 0, 00:15:31.633 "data_size": 0 00:15:31.633 }, 00:15:31.633 { 00:15:31.633 "name": "BaseBdev3", 00:15:31.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.633 "is_configured": false, 00:15:31.633 "data_offset": 0, 00:15:31.633 "data_size": 0 00:15:31.633 } 00:15:31.633 ] 00:15:31.633 }' 00:15:31.633 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.633 07:51:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:32.202 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:32.202 [2024-07-15 07:51:16.948885] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:32.202 [2024-07-15 07:51:16.948902] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23846d0 name Existed_Raid, state configuring 00:15:32.462 07:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:32.462 [2024-07-15 07:51:17.133366] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:32.462 [2024-07-15 07:51:17.133385] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:32.462 [2024-07-15 07:51:17.133390] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:32.462 [2024-07-15 07:51:17.133396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:15:32.462 [2024-07-15 07:51:17.133400] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:32.462 [2024-07-15 07:51:17.133406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:32.463 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:32.722 [2024-07-15 07:51:17.324342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:32.722 BaseBdev1 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:32.722 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:32.981 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:32.981 [ 00:15:32.981 { 00:15:32.981 "name": "BaseBdev1", 00:15:32.981 "aliases": [ 00:15:32.981 "283f5c92-3256-4157-a726-062cf29aca67" 00:15:32.981 ], 00:15:32.981 "product_name": "Malloc disk", 00:15:32.981 "block_size": 512, 00:15:32.981 "num_blocks": 65536, 00:15:32.981 "uuid": "283f5c92-3256-4157-a726-062cf29aca67", 00:15:32.981 "assigned_rate_limits": { 00:15:32.981 "rw_ios_per_sec": 0, 00:15:32.981 "rw_mbytes_per_sec": 0, 00:15:32.981 "r_mbytes_per_sec": 0, 00:15:32.981 "w_mbytes_per_sec": 0 00:15:32.981 }, 00:15:32.981 "claimed": true, 00:15:32.981 "claim_type": "exclusive_write", 00:15:32.981 "zoned": false, 00:15:32.981 "supported_io_types": { 00:15:32.981 "read": true, 00:15:32.981 "write": true, 00:15:32.981 "unmap": true, 00:15:32.981 "flush": true, 00:15:32.981 "reset": true, 00:15:32.981 "nvme_admin": false, 00:15:32.981 "nvme_io": false, 00:15:32.981 "nvme_io_md": false, 00:15:32.981 "write_zeroes": true, 00:15:32.981 "zcopy": true, 00:15:32.981 "get_zone_info": false, 00:15:32.981 "zone_management": false, 00:15:32.981 "zone_append": false, 00:15:32.981 "compare": false, 00:15:32.981 "compare_and_write": false, 00:15:32.981 "abort": true, 00:15:32.981 "seek_hole": false, 00:15:32.981 "seek_data": false, 00:15:32.981 "copy": true, 00:15:32.981 "nvme_iov_md": false 00:15:32.981 }, 00:15:32.981 "memory_domains": [ 00:15:32.981 { 00:15:32.981 "dma_device_id": "system", 00:15:32.982 "dma_device_type": 1 00:15:32.982 }, 00:15:32.982 { 00:15:32.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.982 "dma_device_type": 2 00:15:32.982 } 00:15:32.982 ], 00:15:32.982 "driver_specific": {} 00:15:32.982 } 00:15:32.982 ] 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.982 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.243 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.243 "name": "Existed_Raid", 00:15:33.243 "uuid": "c45f573c-701c-4ceb-8986-117d5e2ba8e7", 00:15:33.243 "strip_size_kb": 0, 00:15:33.243 "state": "configuring", 00:15:33.243 "raid_level": "raid1", 00:15:33.243 "superblock": true, 00:15:33.243 "num_base_bdevs": 3, 00:15:33.243 "num_base_bdevs_discovered": 1, 00:15:33.243 "num_base_bdevs_operational": 3, 00:15:33.243 "base_bdevs_list": [ 00:15:33.243 { 00:15:33.243 "name": "BaseBdev1", 00:15:33.243 "uuid": "283f5c92-3256-4157-a726-062cf29aca67", 00:15:33.243 "is_configured": true, 00:15:33.243 "data_offset": 2048, 00:15:33.243 "data_size": 63488 00:15:33.243 }, 00:15:33.243 { 00:15:33.243 "name": "BaseBdev2", 00:15:33.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.243 "is_configured": false, 00:15:33.243 "data_offset": 0, 00:15:33.243 "data_size": 0 00:15:33.243 }, 00:15:33.243 { 00:15:33.243 "name": "BaseBdev3", 00:15:33.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.243 "is_configured": false, 00:15:33.243 "data_offset": 0, 00:15:33.243 "data_size": 0 00:15:33.243 } 00:15:33.243 ] 00:15:33.243 }' 00:15:33.243 07:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.243 07:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.812 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:34.071 [2024-07-15 07:51:18.635652] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:34.071 [2024-07-15 07:51:18.635675] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2383fa0 name Existed_Raid, state configuring 00:15:34.071 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:34.332 [2024-07-15 07:51:18.828168] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:34.332 [2024-07-15 07:51:18.829287] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:34.332 [2024-07-15 07:51:18.829309] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:34.332 [2024-07-15 07:51:18.829315] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:34.332 [2024-07-15 07:51:18.829320] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.332 07:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.332 07:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.332 "name": "Existed_Raid", 00:15:34.332 "uuid": "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0", 00:15:34.332 "strip_size_kb": 0, 00:15:34.332 "state": "configuring", 00:15:34.332 "raid_level": "raid1", 00:15:34.332 "superblock": true, 00:15:34.332 "num_base_bdevs": 3, 00:15:34.332 "num_base_bdevs_discovered": 1, 00:15:34.332 "num_base_bdevs_operational": 3, 00:15:34.332 "base_bdevs_list": [ 00:15:34.332 { 00:15:34.332 "name": "BaseBdev1", 00:15:34.332 "uuid": "283f5c92-3256-4157-a726-062cf29aca67", 00:15:34.332 "is_configured": true, 00:15:34.332 "data_offset": 2048, 00:15:34.332 "data_size": 63488 00:15:34.332 }, 00:15:34.332 { 00:15:34.332 "name": "BaseBdev2", 00:15:34.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.332 "is_configured": false, 00:15:34.332 "data_offset": 0, 00:15:34.332 "data_size": 0 00:15:34.332 }, 00:15:34.332 { 00:15:34.332 "name": 
"BaseBdev3", 00:15:34.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.332 "is_configured": false, 00:15:34.332 "data_offset": 0, 00:15:34.332 "data_size": 0 00:15:34.332 } 00:15:34.332 ] 00:15:34.332 }' 00:15:34.332 07:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.332 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:34.902 07:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:35.162 [2024-07-15 07:51:19.763485] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:35.163 BaseBdev2 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:35.163 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:35.423 07:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:35.423 [ 00:15:35.423 { 00:15:35.423 "name": "BaseBdev2", 00:15:35.423 "aliases": [ 00:15:35.423 "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e" 00:15:35.423 ], 00:15:35.423 "product_name": "Malloc disk", 00:15:35.423 "block_size": 512, 00:15:35.423 "num_blocks": 65536, 00:15:35.423 "uuid": "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e", 00:15:35.423 "assigned_rate_limits": { 00:15:35.423 "rw_ios_per_sec": 0, 00:15:35.423 "rw_mbytes_per_sec": 0, 00:15:35.423 "r_mbytes_per_sec": 0, 00:15:35.423 "w_mbytes_per_sec": 0 00:15:35.423 }, 00:15:35.423 "claimed": true, 00:15:35.423 "claim_type": "exclusive_write", 00:15:35.423 "zoned": false, 00:15:35.423 "supported_io_types": { 00:15:35.423 "read": true, 00:15:35.423 "write": true, 00:15:35.423 "unmap": true, 00:15:35.423 "flush": true, 00:15:35.423 "reset": true, 00:15:35.423 "nvme_admin": false, 00:15:35.423 "nvme_io": false, 00:15:35.423 "nvme_io_md": false, 00:15:35.423 "write_zeroes": true, 00:15:35.423 "zcopy": true, 00:15:35.423 "get_zone_info": false, 00:15:35.423 "zone_management": false, 00:15:35.423 "zone_append": false, 00:15:35.423 "compare": false, 00:15:35.424 "compare_and_write": false, 00:15:35.424 "abort": true, 00:15:35.424 "seek_hole": false, 00:15:35.424 "seek_data": false, 00:15:35.424 "copy": true, 00:15:35.424 "nvme_iov_md": false 00:15:35.424 }, 00:15:35.424 "memory_domains": [ 00:15:35.424 { 00:15:35.424 "dma_device_id": "system", 00:15:35.424 "dma_device_type": 1 00:15:35.424 }, 00:15:35.424 { 00:15:35.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.424 "dma_device_type": 2 00:15:35.424 } 00:15:35.424 ], 00:15:35.424 "driver_specific": {} 
00:15:35.424 } 00:15:35.424 ] 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.424 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.684 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.684 "name": "Existed_Raid", 00:15:35.684 "uuid": "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0", 00:15:35.684 "strip_size_kb": 0, 00:15:35.684 "state": "configuring", 00:15:35.684 "raid_level": "raid1", 00:15:35.684 "superblock": true, 00:15:35.684 "num_base_bdevs": 3, 00:15:35.684 "num_base_bdevs_discovered": 2, 00:15:35.684 "num_base_bdevs_operational": 3, 00:15:35.684 "base_bdevs_list": [ 00:15:35.684 { 00:15:35.684 "name": "BaseBdev1", 00:15:35.684 "uuid": "283f5c92-3256-4157-a726-062cf29aca67", 00:15:35.684 "is_configured": true, 00:15:35.684 "data_offset": 2048, 00:15:35.684 "data_size": 63488 00:15:35.684 }, 00:15:35.684 { 00:15:35.684 "name": "BaseBdev2", 00:15:35.684 "uuid": "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e", 00:15:35.684 "is_configured": true, 00:15:35.684 "data_offset": 2048, 00:15:35.684 "data_size": 63488 00:15:35.684 }, 00:15:35.684 { 00:15:35.684 "name": "BaseBdev3", 00:15:35.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.684 "is_configured": false, 00:15:35.684 "data_offset": 0, 00:15:35.684 "data_size": 0 00:15:35.684 } 00:15:35.684 ] 00:15:35.684 }' 00:15:35.684 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.684 07:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:36.254 07:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:36.515 [2024-07-15 
07:51:21.015536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:36.515 [2024-07-15 07:51:21.015654] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2384e90 00:15:36.515 [2024-07-15 07:51:21.015663] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:36.515 [2024-07-15 07:51:21.015807] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2384b60 00:15:36.515 [2024-07-15 07:51:21.015904] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2384e90 00:15:36.515 [2024-07-15 07:51:21.015910] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2384e90 00:15:36.515 [2024-07-15 07:51:21.015976] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:36.515 BaseBdev3 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:36.515 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:36.775 [ 00:15:36.775 { 00:15:36.775 "name": "BaseBdev3", 00:15:36.775 "aliases": [ 00:15:36.775 "368030ae-91e8-434d-85f8-0b3e136976fd" 00:15:36.775 ], 00:15:36.775 "product_name": "Malloc disk", 00:15:36.775 "block_size": 512, 00:15:36.775 "num_blocks": 65536, 00:15:36.775 "uuid": "368030ae-91e8-434d-85f8-0b3e136976fd", 00:15:36.775 "assigned_rate_limits": { 00:15:36.775 "rw_ios_per_sec": 0, 00:15:36.775 "rw_mbytes_per_sec": 0, 00:15:36.775 "r_mbytes_per_sec": 0, 00:15:36.775 "w_mbytes_per_sec": 0 00:15:36.775 }, 00:15:36.775 "claimed": true, 00:15:36.775 "claim_type": "exclusive_write", 00:15:36.775 "zoned": false, 00:15:36.775 "supported_io_types": { 00:15:36.775 "read": true, 00:15:36.775 "write": true, 00:15:36.775 "unmap": true, 00:15:36.775 "flush": true, 00:15:36.775 "reset": true, 00:15:36.775 "nvme_admin": false, 00:15:36.775 "nvme_io": false, 00:15:36.775 "nvme_io_md": false, 00:15:36.775 "write_zeroes": true, 00:15:36.775 "zcopy": true, 00:15:36.775 "get_zone_info": false, 00:15:36.775 "zone_management": false, 00:15:36.775 "zone_append": false, 00:15:36.775 "compare": false, 00:15:36.775 "compare_and_write": false, 00:15:36.775 "abort": true, 00:15:36.775 "seek_hole": false, 00:15:36.775 "seek_data": false, 00:15:36.775 "copy": true, 00:15:36.775 "nvme_iov_md": false 00:15:36.775 }, 00:15:36.775 "memory_domains": [ 00:15:36.775 { 00:15:36.775 "dma_device_id": "system", 00:15:36.775 "dma_device_type": 1 00:15:36.775 }, 00:15:36.775 { 00:15:36.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.775 
"dma_device_type": 2 00:15:36.775 } 00:15:36.775 ], 00:15:36.775 "driver_specific": {} 00:15:36.775 } 00:15:36.775 ] 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:36.775 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.776 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.776 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.776 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.776 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.776 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.776 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.036 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.036 "name": "Existed_Raid", 00:15:37.036 "uuid": "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0", 00:15:37.036 "strip_size_kb": 0, 00:15:37.036 "state": "online", 00:15:37.036 "raid_level": "raid1", 00:15:37.036 "superblock": true, 00:15:37.036 "num_base_bdevs": 3, 00:15:37.036 "num_base_bdevs_discovered": 3, 00:15:37.036 "num_base_bdevs_operational": 3, 00:15:37.036 "base_bdevs_list": [ 00:15:37.036 { 00:15:37.036 "name": "BaseBdev1", 00:15:37.036 "uuid": "283f5c92-3256-4157-a726-062cf29aca67", 00:15:37.036 "is_configured": true, 00:15:37.036 "data_offset": 2048, 00:15:37.036 "data_size": 63488 00:15:37.036 }, 00:15:37.036 { 00:15:37.036 "name": "BaseBdev2", 00:15:37.036 "uuid": "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e", 00:15:37.036 "is_configured": true, 00:15:37.036 "data_offset": 2048, 00:15:37.036 "data_size": 63488 00:15:37.036 }, 00:15:37.036 { 00:15:37.036 "name": "BaseBdev3", 00:15:37.036 "uuid": "368030ae-91e8-434d-85f8-0b3e136976fd", 00:15:37.036 "is_configured": true, 00:15:37.036 "data_offset": 2048, 00:15:37.036 "data_size": 63488 00:15:37.036 } 00:15:37.036 ] 00:15:37.036 }' 00:15:37.036 07:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.036 07:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:37.977 07:51:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:37.977 [2024-07-15 07:51:22.668065] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:37.977 "name": "Existed_Raid", 00:15:37.977 "aliases": [ 00:15:37.977 "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0" 00:15:37.977 ], 00:15:37.977 "product_name": "Raid Volume", 00:15:37.977 "block_size": 512, 00:15:37.977 "num_blocks": 63488, 00:15:37.977 "uuid": "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0", 00:15:37.977 "assigned_rate_limits": { 00:15:37.977 "rw_ios_per_sec": 0, 00:15:37.977 "rw_mbytes_per_sec": 0, 00:15:37.977 "r_mbytes_per_sec": 0, 00:15:37.977 "w_mbytes_per_sec": 0 00:15:37.977 }, 00:15:37.977 "claimed": false, 00:15:37.977 "zoned": false, 00:15:37.977 "supported_io_types": { 00:15:37.977 "read": true, 00:15:37.977 "write": true, 00:15:37.977 "unmap": false, 00:15:37.977 "flush": false, 00:15:37.977 "reset": true, 00:15:37.977 "nvme_admin": false, 00:15:37.977 "nvme_io": false, 00:15:37.977 "nvme_io_md": false, 00:15:37.977 "write_zeroes": true, 00:15:37.977 "zcopy": false, 00:15:37.977 "get_zone_info": false, 00:15:37.977 "zone_management": false, 00:15:37.977 "zone_append": false, 00:15:37.977 "compare": false, 00:15:37.977 "compare_and_write": false, 00:15:37.977 "abort": false, 00:15:37.977 "seek_hole": false, 00:15:37.977 "seek_data": false, 00:15:37.977 "copy": false, 00:15:37.977 "nvme_iov_md": false 00:15:37.977 }, 00:15:37.977 "memory_domains": [ 00:15:37.977 { 00:15:37.977 "dma_device_id": "system", 00:15:37.977 "dma_device_type": 1 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.977 "dma_device_type": 2 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "dma_device_id": "system", 00:15:37.977 "dma_device_type": 1 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.977 "dma_device_type": 2 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "dma_device_id": "system", 00:15:37.977 "dma_device_type": 1 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.977 "dma_device_type": 2 00:15:37.977 } 00:15:37.977 ], 00:15:37.977 "driver_specific": { 00:15:37.977 "raid": { 00:15:37.977 "uuid": "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0", 00:15:37.977 "strip_size_kb": 0, 00:15:37.977 "state": "online", 00:15:37.977 "raid_level": "raid1", 00:15:37.977 "superblock": true, 00:15:37.977 "num_base_bdevs": 3, 00:15:37.977 "num_base_bdevs_discovered": 3, 00:15:37.977 "num_base_bdevs_operational": 3, 00:15:37.977 "base_bdevs_list": [ 00:15:37.977 { 00:15:37.977 "name": "BaseBdev1", 00:15:37.977 "uuid": 
"283f5c92-3256-4157-a726-062cf29aca67", 00:15:37.977 "is_configured": true, 00:15:37.977 "data_offset": 2048, 00:15:37.977 "data_size": 63488 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "name": "BaseBdev2", 00:15:37.977 "uuid": "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e", 00:15:37.977 "is_configured": true, 00:15:37.977 "data_offset": 2048, 00:15:37.977 "data_size": 63488 00:15:37.977 }, 00:15:37.977 { 00:15:37.977 "name": "BaseBdev3", 00:15:37.977 "uuid": "368030ae-91e8-434d-85f8-0b3e136976fd", 00:15:37.977 "is_configured": true, 00:15:37.977 "data_offset": 2048, 00:15:37.977 "data_size": 63488 00:15:37.977 } 00:15:37.977 ] 00:15:37.977 } 00:15:37.977 } 00:15:37.977 }' 00:15:37.977 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:38.237 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:38.237 BaseBdev2 00:15:38.237 BaseBdev3' 00:15:38.237 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.237 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:38.237 07:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.807 "name": "BaseBdev1", 00:15:38.807 "aliases": [ 00:15:38.807 "283f5c92-3256-4157-a726-062cf29aca67" 00:15:38.807 ], 00:15:38.807 "product_name": "Malloc disk", 00:15:38.807 "block_size": 512, 00:15:38.807 "num_blocks": 65536, 00:15:38.807 "uuid": "283f5c92-3256-4157-a726-062cf29aca67", 00:15:38.807 "assigned_rate_limits": { 00:15:38.807 "rw_ios_per_sec": 0, 00:15:38.807 "rw_mbytes_per_sec": 0, 00:15:38.807 "r_mbytes_per_sec": 0, 00:15:38.807 "w_mbytes_per_sec": 0 00:15:38.807 }, 00:15:38.807 "claimed": true, 00:15:38.807 "claim_type": "exclusive_write", 00:15:38.807 "zoned": false, 00:15:38.807 "supported_io_types": { 00:15:38.807 "read": true, 00:15:38.807 "write": true, 00:15:38.807 "unmap": true, 00:15:38.807 "flush": true, 00:15:38.807 "reset": true, 00:15:38.807 "nvme_admin": false, 00:15:38.807 "nvme_io": false, 00:15:38.807 "nvme_io_md": false, 00:15:38.807 "write_zeroes": true, 00:15:38.807 "zcopy": true, 00:15:38.807 "get_zone_info": false, 00:15:38.807 "zone_management": false, 00:15:38.807 "zone_append": false, 00:15:38.807 "compare": false, 00:15:38.807 "compare_and_write": false, 00:15:38.807 "abort": true, 00:15:38.807 "seek_hole": false, 00:15:38.807 "seek_data": false, 00:15:38.807 "copy": true, 00:15:38.807 "nvme_iov_md": false 00:15:38.807 }, 00:15:38.807 "memory_domains": [ 00:15:38.807 { 00:15:38.807 "dma_device_id": "system", 00:15:38.807 "dma_device_type": 1 00:15:38.807 }, 00:15:38.807 { 00:15:38.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.807 "dma_device_type": 2 00:15:38.807 } 00:15:38.807 ], 00:15:38.807 "driver_specific": {} 00:15:38.807 }' 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.807 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.066 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.066 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.066 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.066 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:39.066 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.324 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.324 "name": "BaseBdev2", 00:15:39.324 "aliases": [ 00:15:39.324 "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e" 00:15:39.324 ], 00:15:39.324 "product_name": "Malloc disk", 00:15:39.324 "block_size": 512, 00:15:39.324 "num_blocks": 65536, 00:15:39.324 "uuid": "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e", 00:15:39.324 "assigned_rate_limits": { 00:15:39.324 "rw_ios_per_sec": 0, 00:15:39.324 "rw_mbytes_per_sec": 0, 00:15:39.324 "r_mbytes_per_sec": 0, 00:15:39.324 "w_mbytes_per_sec": 0 00:15:39.324 }, 00:15:39.324 "claimed": true, 00:15:39.324 "claim_type": "exclusive_write", 00:15:39.324 "zoned": false, 00:15:39.324 "supported_io_types": { 00:15:39.324 "read": true, 00:15:39.324 "write": true, 00:15:39.324 "unmap": true, 00:15:39.324 "flush": true, 00:15:39.324 "reset": true, 00:15:39.324 "nvme_admin": false, 00:15:39.325 "nvme_io": false, 00:15:39.325 "nvme_io_md": false, 00:15:39.325 "write_zeroes": true, 00:15:39.325 "zcopy": true, 00:15:39.325 "get_zone_info": false, 00:15:39.325 "zone_management": false, 00:15:39.325 "zone_append": false, 00:15:39.325 "compare": false, 00:15:39.325 "compare_and_write": false, 00:15:39.325 "abort": true, 00:15:39.325 "seek_hole": false, 00:15:39.325 "seek_data": false, 00:15:39.325 "copy": true, 00:15:39.325 "nvme_iov_md": false 00:15:39.325 }, 00:15:39.325 "memory_domains": [ 00:15:39.325 { 00:15:39.325 "dma_device_id": "system", 00:15:39.325 "dma_device_type": 1 00:15:39.325 }, 00:15:39.325 { 00:15:39.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.325 "dma_device_type": 2 00:15:39.325 } 00:15:39.325 ], 00:15:39.325 "driver_specific": {} 00:15:39.325 }' 00:15:39.325 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.325 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.325 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.325 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.325 07:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:39.325 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.325 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.325 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:39.584 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.844 "name": "BaseBdev3", 00:15:39.844 "aliases": [ 00:15:39.844 "368030ae-91e8-434d-85f8-0b3e136976fd" 00:15:39.844 ], 00:15:39.844 "product_name": "Malloc disk", 00:15:39.844 "block_size": 512, 00:15:39.844 "num_blocks": 65536, 00:15:39.844 "uuid": "368030ae-91e8-434d-85f8-0b3e136976fd", 00:15:39.844 "assigned_rate_limits": { 00:15:39.844 "rw_ios_per_sec": 0, 00:15:39.844 "rw_mbytes_per_sec": 0, 00:15:39.844 "r_mbytes_per_sec": 0, 00:15:39.844 "w_mbytes_per_sec": 0 00:15:39.844 }, 00:15:39.844 "claimed": true, 00:15:39.844 "claim_type": "exclusive_write", 00:15:39.844 "zoned": false, 00:15:39.844 "supported_io_types": { 00:15:39.844 "read": true, 00:15:39.844 "write": true, 00:15:39.844 "unmap": true, 00:15:39.844 "flush": true, 00:15:39.844 "reset": true, 00:15:39.844 "nvme_admin": false, 00:15:39.844 "nvme_io": false, 00:15:39.844 "nvme_io_md": false, 00:15:39.844 "write_zeroes": true, 00:15:39.844 "zcopy": true, 00:15:39.844 "get_zone_info": false, 00:15:39.844 "zone_management": false, 00:15:39.844 "zone_append": false, 00:15:39.844 "compare": false, 00:15:39.844 "compare_and_write": false, 00:15:39.844 "abort": true, 00:15:39.844 "seek_hole": false, 00:15:39.844 "seek_data": false, 00:15:39.844 "copy": true, 00:15:39.844 "nvme_iov_md": false 00:15:39.844 }, 00:15:39.844 "memory_domains": [ 00:15:39.844 { 00:15:39.844 "dma_device_id": "system", 00:15:39.844 "dma_device_type": 1 00:15:39.844 }, 00:15:39.844 { 00:15:39.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.844 "dma_device_type": 2 00:15:39.844 } 00:15:39.844 ], 00:15:39.844 "driver_specific": {} 00:15:39.844 }' 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:39.844 
07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:39.844 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.104 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.104 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.104 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.104 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.104 07:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:40.674 [2024-07-15 07:51:25.222337] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.674 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.935 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.935 "name": "Existed_Raid", 00:15:40.935 "uuid": "80f5b7ac-20dd-4dc9-8a0e-1f46ecc10aa0", 00:15:40.935 "strip_size_kb": 0, 00:15:40.935 "state": "online", 00:15:40.935 "raid_level": "raid1", 00:15:40.935 "superblock": true, 00:15:40.935 "num_base_bdevs": 3, 00:15:40.935 "num_base_bdevs_discovered": 2, 00:15:40.935 "num_base_bdevs_operational": 2, 00:15:40.935 "base_bdevs_list": [ 00:15:40.935 { 00:15:40.935 "name": null, 00:15:40.935 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:40.935 "is_configured": false, 00:15:40.935 "data_offset": 2048, 00:15:40.935 "data_size": 63488 00:15:40.935 }, 00:15:40.935 { 00:15:40.935 "name": "BaseBdev2", 00:15:40.935 "uuid": "0cda6a7c-c57a-4fa8-a3ce-3936ed69751e", 00:15:40.935 "is_configured": true, 00:15:40.935 "data_offset": 2048, 00:15:40.935 "data_size": 63488 00:15:40.935 }, 00:15:40.935 { 00:15:40.935 "name": "BaseBdev3", 00:15:40.935 "uuid": "368030ae-91e8-434d-85f8-0b3e136976fd", 00:15:40.935 "is_configured": true, 00:15:40.935 "data_offset": 2048, 00:15:40.935 "data_size": 63488 00:15:40.935 } 00:15:40.935 ] 00:15:40.935 }' 00:15:40.935 07:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.935 07:51:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:41.504 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:42.073 [2024-07-15 07:51:26.710097] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:42.073 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:42.073 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:42.073 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.073 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:42.333 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:42.333 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:42.333 07:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:42.904 [2024-07-15 07:51:27.449799] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:42.904 [2024-07-15 07:51:27.449863] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:42.904 [2024-07-15 07:51:27.455783] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.904 [2024-07-15 07:51:27.455808] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.904 [2024-07-15 07:51:27.455814] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2384e90 
name Existed_Raid, state offline 00:15:42.904 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:42.904 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:42.904 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.904 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:43.164 BaseBdev2 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.164 07:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:43.424 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:43.683 [ 00:15:43.683 { 00:15:43.683 "name": "BaseBdev2", 00:15:43.683 "aliases": [ 00:15:43.683 "9e1ea012-e648-49b5-8317-d41e114b117c" 00:15:43.683 ], 00:15:43.683 "product_name": "Malloc disk", 00:15:43.683 "block_size": 512, 00:15:43.683 "num_blocks": 65536, 00:15:43.683 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:43.683 "assigned_rate_limits": { 00:15:43.683 "rw_ios_per_sec": 0, 00:15:43.683 "rw_mbytes_per_sec": 0, 00:15:43.683 "r_mbytes_per_sec": 0, 00:15:43.683 "w_mbytes_per_sec": 0 00:15:43.683 }, 00:15:43.683 "claimed": false, 00:15:43.683 "zoned": false, 00:15:43.683 "supported_io_types": { 00:15:43.683 "read": true, 00:15:43.683 "write": true, 00:15:43.683 "unmap": true, 00:15:43.683 "flush": true, 00:15:43.683 "reset": true, 00:15:43.683 "nvme_admin": false, 00:15:43.683 "nvme_io": false, 00:15:43.683 "nvme_io_md": false, 00:15:43.683 "write_zeroes": true, 00:15:43.683 "zcopy": true, 00:15:43.683 "get_zone_info": false, 00:15:43.683 "zone_management": false, 00:15:43.683 "zone_append": false, 00:15:43.683 "compare": false, 00:15:43.683 
"compare_and_write": false, 00:15:43.683 "abort": true, 00:15:43.683 "seek_hole": false, 00:15:43.683 "seek_data": false, 00:15:43.683 "copy": true, 00:15:43.683 "nvme_iov_md": false 00:15:43.683 }, 00:15:43.683 "memory_domains": [ 00:15:43.683 { 00:15:43.683 "dma_device_id": "system", 00:15:43.683 "dma_device_type": 1 00:15:43.683 }, 00:15:43.683 { 00:15:43.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.683 "dma_device_type": 2 00:15:43.683 } 00:15:43.683 ], 00:15:43.683 "driver_specific": {} 00:15:43.683 } 00:15:43.683 ] 00:15:43.683 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:43.684 BaseBdev3 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.684 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:43.943 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:44.204 [ 00:15:44.204 { 00:15:44.204 "name": "BaseBdev3", 00:15:44.204 "aliases": [ 00:15:44.204 "fd9aafc3-5311-4072-a1bc-317cceeb5b78" 00:15:44.204 ], 00:15:44.204 "product_name": "Malloc disk", 00:15:44.204 "block_size": 512, 00:15:44.204 "num_blocks": 65536, 00:15:44.204 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:44.204 "assigned_rate_limits": { 00:15:44.204 "rw_ios_per_sec": 0, 00:15:44.204 "rw_mbytes_per_sec": 0, 00:15:44.204 "r_mbytes_per_sec": 0, 00:15:44.204 "w_mbytes_per_sec": 0 00:15:44.204 }, 00:15:44.204 "claimed": false, 00:15:44.204 "zoned": false, 00:15:44.204 "supported_io_types": { 00:15:44.204 "read": true, 00:15:44.204 "write": true, 00:15:44.204 "unmap": true, 00:15:44.204 "flush": true, 00:15:44.204 "reset": true, 00:15:44.204 "nvme_admin": false, 00:15:44.204 "nvme_io": false, 00:15:44.204 "nvme_io_md": false, 00:15:44.204 "write_zeroes": true, 00:15:44.204 "zcopy": true, 00:15:44.204 "get_zone_info": false, 00:15:44.204 "zone_management": false, 00:15:44.204 "zone_append": false, 00:15:44.204 "compare": false, 00:15:44.204 "compare_and_write": false, 00:15:44.204 "abort": true, 00:15:44.204 "seek_hole": false, 00:15:44.204 "seek_data": false, 00:15:44.204 "copy": true, 00:15:44.204 "nvme_iov_md": false 00:15:44.204 }, 00:15:44.204 "memory_domains": [ 00:15:44.204 { 
00:15:44.204 "dma_device_id": "system", 00:15:44.204 "dma_device_type": 1 00:15:44.204 }, 00:15:44.204 { 00:15:44.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.204 "dma_device_type": 2 00:15:44.204 } 00:15:44.204 ], 00:15:44.204 "driver_specific": {} 00:15:44.204 } 00:15:44.204 ] 00:15:44.204 07:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:44.204 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:44.204 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:44.204 07:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:44.773 [2024-07-15 07:51:29.314283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.773 [2024-07-15 07:51:29.314315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:44.773 [2024-07-15 07:51:29.314329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:44.773 [2024-07-15 07:51:29.315361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:44.773 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:44.773 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.773 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.773 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.774 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.033 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.033 "name": "Existed_Raid", 00:15:45.033 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:45.033 "strip_size_kb": 0, 00:15:45.033 "state": "configuring", 00:15:45.033 "raid_level": "raid1", 00:15:45.033 "superblock": true, 00:15:45.033 "num_base_bdevs": 3, 00:15:45.033 "num_base_bdevs_discovered": 2, 00:15:45.033 "num_base_bdevs_operational": 3, 00:15:45.033 "base_bdevs_list": [ 00:15:45.033 { 00:15:45.033 "name": "BaseBdev1", 00:15:45.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.033 "is_configured": 
false, 00:15:45.033 "data_offset": 0, 00:15:45.033 "data_size": 0 00:15:45.033 }, 00:15:45.033 { 00:15:45.033 "name": "BaseBdev2", 00:15:45.033 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:45.033 "is_configured": true, 00:15:45.033 "data_offset": 2048, 00:15:45.033 "data_size": 63488 00:15:45.033 }, 00:15:45.033 { 00:15:45.033 "name": "BaseBdev3", 00:15:45.033 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:45.033 "is_configured": true, 00:15:45.033 "data_offset": 2048, 00:15:45.033 "data_size": 63488 00:15:45.033 } 00:15:45.033 ] 00:15:45.033 }' 00:15:45.034 07:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.034 07:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:45.666 [2024-07-15 07:51:30.264677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.666 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.926 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.926 "name": "Existed_Raid", 00:15:45.926 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:45.926 "strip_size_kb": 0, 00:15:45.926 "state": "configuring", 00:15:45.926 "raid_level": "raid1", 00:15:45.926 "superblock": true, 00:15:45.926 "num_base_bdevs": 3, 00:15:45.926 "num_base_bdevs_discovered": 1, 00:15:45.926 "num_base_bdevs_operational": 3, 00:15:45.926 "base_bdevs_list": [ 00:15:45.926 { 00:15:45.926 "name": "BaseBdev1", 00:15:45.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.926 "is_configured": false, 00:15:45.926 "data_offset": 0, 00:15:45.926 "data_size": 0 00:15:45.926 }, 00:15:45.926 { 00:15:45.926 "name": null, 00:15:45.926 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:45.926 "is_configured": false, 00:15:45.926 "data_offset": 2048, 00:15:45.926 "data_size": 
63488 00:15:45.926 }, 00:15:45.926 { 00:15:45.926 "name": "BaseBdev3", 00:15:45.926 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:45.926 "is_configured": true, 00:15:45.926 "data_offset": 2048, 00:15:45.926 "data_size": 63488 00:15:45.926 } 00:15:45.926 ] 00:15:45.926 }' 00:15:45.926 07:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.926 07:51:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.496 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.496 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:46.496 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:46.496 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:46.756 [2024-07-15 07:51:31.368464] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.756 BaseBdev1 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:46.756 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.015 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:47.016 [ 00:15:47.016 { 00:15:47.016 "name": "BaseBdev1", 00:15:47.016 "aliases": [ 00:15:47.016 "7fbbaec8-dfed-4a47-8259-3243639b2051" 00:15:47.016 ], 00:15:47.016 "product_name": "Malloc disk", 00:15:47.016 "block_size": 512, 00:15:47.016 "num_blocks": 65536, 00:15:47.016 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:47.016 "assigned_rate_limits": { 00:15:47.016 "rw_ios_per_sec": 0, 00:15:47.016 "rw_mbytes_per_sec": 0, 00:15:47.016 "r_mbytes_per_sec": 0, 00:15:47.016 "w_mbytes_per_sec": 0 00:15:47.016 }, 00:15:47.016 "claimed": true, 00:15:47.016 "claim_type": "exclusive_write", 00:15:47.016 "zoned": false, 00:15:47.016 "supported_io_types": { 00:15:47.016 "read": true, 00:15:47.016 "write": true, 00:15:47.016 "unmap": true, 00:15:47.016 "flush": true, 00:15:47.016 "reset": true, 00:15:47.016 "nvme_admin": false, 00:15:47.016 "nvme_io": false, 00:15:47.016 "nvme_io_md": false, 00:15:47.016 "write_zeroes": true, 00:15:47.016 "zcopy": true, 00:15:47.016 "get_zone_info": false, 00:15:47.016 "zone_management": false, 00:15:47.016 "zone_append": false, 00:15:47.016 "compare": false, 00:15:47.016 
"compare_and_write": false, 00:15:47.016 "abort": true, 00:15:47.016 "seek_hole": false, 00:15:47.016 "seek_data": false, 00:15:47.016 "copy": true, 00:15:47.016 "nvme_iov_md": false 00:15:47.016 }, 00:15:47.016 "memory_domains": [ 00:15:47.016 { 00:15:47.016 "dma_device_id": "system", 00:15:47.016 "dma_device_type": 1 00:15:47.016 }, 00:15:47.016 { 00:15:47.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.016 "dma_device_type": 2 00:15:47.016 } 00:15:47.016 ], 00:15:47.016 "driver_specific": {} 00:15:47.016 } 00:15:47.016 ] 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.016 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.277 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.277 "name": "Existed_Raid", 00:15:47.277 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:47.277 "strip_size_kb": 0, 00:15:47.277 "state": "configuring", 00:15:47.277 "raid_level": "raid1", 00:15:47.277 "superblock": true, 00:15:47.277 "num_base_bdevs": 3, 00:15:47.277 "num_base_bdevs_discovered": 2, 00:15:47.277 "num_base_bdevs_operational": 3, 00:15:47.277 "base_bdevs_list": [ 00:15:47.277 { 00:15:47.277 "name": "BaseBdev1", 00:15:47.277 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:47.277 "is_configured": true, 00:15:47.277 "data_offset": 2048, 00:15:47.277 "data_size": 63488 00:15:47.277 }, 00:15:47.277 { 00:15:47.277 "name": null, 00:15:47.277 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:47.277 "is_configured": false, 00:15:47.277 "data_offset": 2048, 00:15:47.277 "data_size": 63488 00:15:47.277 }, 00:15:47.277 { 00:15:47.277 "name": "BaseBdev3", 00:15:47.277 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:47.277 "is_configured": true, 00:15:47.277 "data_offset": 2048, 00:15:47.277 "data_size": 63488 00:15:47.277 } 00:15:47.277 ] 00:15:47.277 }' 00:15:47.277 07:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.277 07:51:31 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:15:47.848 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.848 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:48.108 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:48.108 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:48.108 [2024-07-15 07:51:32.852244] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.369 07:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.369 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.369 "name": "Existed_Raid", 00:15:48.369 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:48.369 "strip_size_kb": 0, 00:15:48.369 "state": "configuring", 00:15:48.369 "raid_level": "raid1", 00:15:48.369 "superblock": true, 00:15:48.369 "num_base_bdevs": 3, 00:15:48.369 "num_base_bdevs_discovered": 1, 00:15:48.369 "num_base_bdevs_operational": 3, 00:15:48.369 "base_bdevs_list": [ 00:15:48.369 { 00:15:48.369 "name": "BaseBdev1", 00:15:48.369 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:48.369 "is_configured": true, 00:15:48.369 "data_offset": 2048, 00:15:48.369 "data_size": 63488 00:15:48.369 }, 00:15:48.369 { 00:15:48.369 "name": null, 00:15:48.369 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:48.369 "is_configured": false, 00:15:48.369 "data_offset": 2048, 00:15:48.369 "data_size": 63488 00:15:48.369 }, 00:15:48.369 { 00:15:48.369 "name": null, 00:15:48.369 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:48.369 "is_configured": false, 00:15:48.369 "data_offset": 2048, 00:15:48.369 "data_size": 63488 00:15:48.369 } 00:15:48.369 ] 00:15:48.369 }' 
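The dump above is the test's state-verification step right after BaseBdev3 is removed: the script pulls the raid bdev list over the RPC socket, filters out the Existed_Raid entry with jq, and expects state "configuring", level raid1, 3 operational base bdevs and only 1 discovered. A minimal stand-alone sketch of that kind of check follows. The rpc.py path, socket name, RPC method and jq filter are copied from the trace itself; the helper name check_raid_state, its argument order and the exact fields it compares are illustrative assumptions, not the real verify_raid_bdev_state body in bdev_raid.sh.

#!/usr/bin/env bash
# Sketch only: reproduces the kind of state check the trace above performs.
# Assumes an SPDK app is already listening on the raid test RPC socket.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

check_raid_state() {
    local name=$1 expected_state=$2 expected_level=$3 expected_operational=$4 expected_discovered=$5
    local info
    # Same RPC call and jq filter that appear in the trace
    info=$($rpc bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
    [[ $(jq -r '.state' <<< "$info") == "$expected_state" ]] || return 1
    [[ $(jq -r '.raid_level' <<< "$info") == "$expected_level" ]] || return 1
    [[ $(jq -r '.num_base_bdevs_operational' <<< "$info") -eq $expected_operational ]] || return 1
    [[ $(jq -r '.num_base_bdevs_discovered' <<< "$info") -eq $expected_discovered ]] || return 1
}

# The step traced above: BaseBdev3 was just removed, so Existed_Raid should be
# back in "configuring" with 3 operational but only 1 discovered base bdev.
check_raid_state Existed_Raid configuring raid1 3 1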
00:15:48.369 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.369 07:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.939 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.939 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:49.198 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:49.198 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:49.198 [2024-07-15 07:51:33.947040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.458 07:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.458 07:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.458 "name": "Existed_Raid", 00:15:49.458 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:49.458 "strip_size_kb": 0, 00:15:49.458 "state": "configuring", 00:15:49.458 "raid_level": "raid1", 00:15:49.458 "superblock": true, 00:15:49.458 "num_base_bdevs": 3, 00:15:49.458 "num_base_bdevs_discovered": 2, 00:15:49.458 "num_base_bdevs_operational": 3, 00:15:49.458 "base_bdevs_list": [ 00:15:49.458 { 00:15:49.458 "name": "BaseBdev1", 00:15:49.458 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:49.458 "is_configured": true, 00:15:49.458 "data_offset": 2048, 00:15:49.458 "data_size": 63488 00:15:49.458 }, 00:15:49.458 { 00:15:49.458 "name": null, 00:15:49.458 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:49.458 "is_configured": false, 00:15:49.458 "data_offset": 2048, 00:15:49.458 "data_size": 63488 00:15:49.458 }, 00:15:49.458 { 00:15:49.458 "name": "BaseBdev3", 
00:15:49.458 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:49.458 "is_configured": true, 00:15:49.458 "data_offset": 2048, 00:15:49.458 "data_size": 63488 00:15:49.458 } 00:15:49.458 ] 00:15:49.458 }' 00:15:49.458 07:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.458 07:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:50.029 07:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.029 07:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:50.289 07:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:50.289 07:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:50.549 [2024-07-15 07:51:35.061879] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.549 "name": "Existed_Raid", 00:15:50.549 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:50.549 "strip_size_kb": 0, 00:15:50.549 "state": "configuring", 00:15:50.549 "raid_level": "raid1", 00:15:50.549 "superblock": true, 00:15:50.549 "num_base_bdevs": 3, 00:15:50.549 "num_base_bdevs_discovered": 1, 00:15:50.549 "num_base_bdevs_operational": 3, 00:15:50.549 "base_bdevs_list": [ 00:15:50.549 { 00:15:50.549 "name": null, 00:15:50.549 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:50.549 "is_configured": false, 00:15:50.549 "data_offset": 2048, 00:15:50.549 "data_size": 63488 00:15:50.549 }, 00:15:50.549 { 00:15:50.549 "name": null, 00:15:50.549 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:50.549 
"is_configured": false, 00:15:50.549 "data_offset": 2048, 00:15:50.549 "data_size": 63488 00:15:50.549 }, 00:15:50.549 { 00:15:50.549 "name": "BaseBdev3", 00:15:50.549 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:50.549 "is_configured": true, 00:15:50.549 "data_offset": 2048, 00:15:50.549 "data_size": 63488 00:15:50.549 } 00:15:50.549 ] 00:15:50.549 }' 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.549 07:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:51.118 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.118 07:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:51.378 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:51.378 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:51.639 [2024-07-15 07:51:36.194649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.639 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.899 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.899 "name": "Existed_Raid", 00:15:51.899 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:51.899 "strip_size_kb": 0, 00:15:51.899 "state": "configuring", 00:15:51.899 "raid_level": "raid1", 00:15:51.899 "superblock": true, 00:15:51.899 "num_base_bdevs": 3, 00:15:51.899 "num_base_bdevs_discovered": 2, 00:15:51.899 "num_base_bdevs_operational": 3, 00:15:51.899 "base_bdevs_list": [ 00:15:51.899 { 00:15:51.899 "name": null, 00:15:51.899 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:51.899 "is_configured": false, 
00:15:51.899 "data_offset": 2048, 00:15:51.900 "data_size": 63488 00:15:51.900 }, 00:15:51.900 { 00:15:51.900 "name": "BaseBdev2", 00:15:51.900 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:51.900 "is_configured": true, 00:15:51.900 "data_offset": 2048, 00:15:51.900 "data_size": 63488 00:15:51.900 }, 00:15:51.900 { 00:15:51.900 "name": "BaseBdev3", 00:15:51.900 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:51.900 "is_configured": true, 00:15:51.900 "data_offset": 2048, 00:15:51.900 "data_size": 63488 00:15:51.900 } 00:15:51.900 ] 00:15:51.900 }' 00:15:51.900 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.900 07:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:52.470 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.470 07:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:52.470 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:52.470 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.470 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:52.729 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7fbbaec8-dfed-4a47-8259-3243639b2051 00:15:52.989 [2024-07-15 07:51:37.539013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:52.989 [2024-07-15 07:51:37.539121] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2528890 00:15:52.989 [2024-07-15 07:51:37.539129] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:52.989 [2024-07-15 07:51:37.539265] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23846a0 00:15:52.989 [2024-07-15 07:51:37.539355] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2528890 00:15:52.989 [2024-07-15 07:51:37.539360] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2528890 00:15:52.989 [2024-07-15 07:51:37.539427] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.989 NewBaseBdev 00:15:52.989 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:52.989 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:52.989 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:52.989 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:52.989 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:52.989 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:52.990 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:52.990 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:53.251 [ 00:15:53.251 { 00:15:53.251 "name": "NewBaseBdev", 00:15:53.251 "aliases": [ 00:15:53.251 "7fbbaec8-dfed-4a47-8259-3243639b2051" 00:15:53.251 ], 00:15:53.251 "product_name": "Malloc disk", 00:15:53.251 "block_size": 512, 00:15:53.251 "num_blocks": 65536, 00:15:53.251 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:53.251 "assigned_rate_limits": { 00:15:53.251 "rw_ios_per_sec": 0, 00:15:53.251 "rw_mbytes_per_sec": 0, 00:15:53.251 "r_mbytes_per_sec": 0, 00:15:53.251 "w_mbytes_per_sec": 0 00:15:53.251 }, 00:15:53.251 "claimed": true, 00:15:53.251 "claim_type": "exclusive_write", 00:15:53.251 "zoned": false, 00:15:53.251 "supported_io_types": { 00:15:53.251 "read": true, 00:15:53.251 "write": true, 00:15:53.251 "unmap": true, 00:15:53.251 "flush": true, 00:15:53.251 "reset": true, 00:15:53.251 "nvme_admin": false, 00:15:53.251 "nvme_io": false, 00:15:53.251 "nvme_io_md": false, 00:15:53.251 "write_zeroes": true, 00:15:53.251 "zcopy": true, 00:15:53.251 "get_zone_info": false, 00:15:53.251 "zone_management": false, 00:15:53.251 "zone_append": false, 00:15:53.251 "compare": false, 00:15:53.251 "compare_and_write": false, 00:15:53.251 "abort": true, 00:15:53.251 "seek_hole": false, 00:15:53.251 "seek_data": false, 00:15:53.251 "copy": true, 00:15:53.251 "nvme_iov_md": false 00:15:53.251 }, 00:15:53.251 "memory_domains": [ 00:15:53.251 { 00:15:53.251 "dma_device_id": "system", 00:15:53.251 "dma_device_type": 1 00:15:53.251 }, 00:15:53.251 { 00:15:53.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.251 "dma_device_type": 2 00:15:53.251 } 00:15:53.251 ], 00:15:53.251 "driver_specific": {} 00:15:53.251 } 00:15:53.251 ] 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.251 07:51:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:53.512 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.512 "name": "Existed_Raid", 00:15:53.512 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:53.512 "strip_size_kb": 0, 00:15:53.512 "state": "online", 00:15:53.512 "raid_level": "raid1", 00:15:53.512 "superblock": true, 00:15:53.512 "num_base_bdevs": 3, 00:15:53.512 "num_base_bdevs_discovered": 3, 00:15:53.512 "num_base_bdevs_operational": 3, 00:15:53.512 "base_bdevs_list": [ 00:15:53.512 { 00:15:53.512 "name": "NewBaseBdev", 00:15:53.512 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:53.512 "is_configured": true, 00:15:53.512 "data_offset": 2048, 00:15:53.512 "data_size": 63488 00:15:53.512 }, 00:15:53.512 { 00:15:53.512 "name": "BaseBdev2", 00:15:53.512 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:53.512 "is_configured": true, 00:15:53.512 "data_offset": 2048, 00:15:53.512 "data_size": 63488 00:15:53.512 }, 00:15:53.512 { 00:15:53.512 "name": "BaseBdev3", 00:15:53.512 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:53.512 "is_configured": true, 00:15:53.512 "data_offset": 2048, 00:15:53.512 "data_size": 63488 00:15:53.512 } 00:15:53.512 ] 00:15:53.512 }' 00:15:53.512 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.512 07:51:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:54.083 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:54.083 [2024-07-15 07:51:38.830519] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:54.343 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:54.343 "name": "Existed_Raid", 00:15:54.343 "aliases": [ 00:15:54.343 "3e3bda3f-0fbb-43ef-901c-72f65ea0600e" 00:15:54.343 ], 00:15:54.343 "product_name": "Raid Volume", 00:15:54.343 "block_size": 512, 00:15:54.343 "num_blocks": 63488, 00:15:54.343 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:54.343 "assigned_rate_limits": { 00:15:54.343 "rw_ios_per_sec": 0, 00:15:54.343 "rw_mbytes_per_sec": 0, 00:15:54.343 "r_mbytes_per_sec": 0, 00:15:54.343 "w_mbytes_per_sec": 0 00:15:54.343 }, 00:15:54.343 "claimed": false, 00:15:54.343 "zoned": false, 00:15:54.343 "supported_io_types": { 00:15:54.343 "read": true, 00:15:54.343 "write": true, 00:15:54.343 "unmap": false, 00:15:54.343 "flush": false, 00:15:54.343 "reset": true, 00:15:54.343 "nvme_admin": false, 00:15:54.343 "nvme_io": false, 00:15:54.343 "nvme_io_md": 
false, 00:15:54.343 "write_zeroes": true, 00:15:54.343 "zcopy": false, 00:15:54.343 "get_zone_info": false, 00:15:54.343 "zone_management": false, 00:15:54.343 "zone_append": false, 00:15:54.343 "compare": false, 00:15:54.343 "compare_and_write": false, 00:15:54.343 "abort": false, 00:15:54.343 "seek_hole": false, 00:15:54.343 "seek_data": false, 00:15:54.343 "copy": false, 00:15:54.343 "nvme_iov_md": false 00:15:54.343 }, 00:15:54.343 "memory_domains": [ 00:15:54.343 { 00:15:54.343 "dma_device_id": "system", 00:15:54.343 "dma_device_type": 1 00:15:54.343 }, 00:15:54.343 { 00:15:54.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.343 "dma_device_type": 2 00:15:54.343 }, 00:15:54.343 { 00:15:54.343 "dma_device_id": "system", 00:15:54.343 "dma_device_type": 1 00:15:54.343 }, 00:15:54.343 { 00:15:54.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.343 "dma_device_type": 2 00:15:54.343 }, 00:15:54.343 { 00:15:54.343 "dma_device_id": "system", 00:15:54.343 "dma_device_type": 1 00:15:54.343 }, 00:15:54.343 { 00:15:54.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.343 "dma_device_type": 2 00:15:54.343 } 00:15:54.343 ], 00:15:54.343 "driver_specific": { 00:15:54.343 "raid": { 00:15:54.343 "uuid": "3e3bda3f-0fbb-43ef-901c-72f65ea0600e", 00:15:54.343 "strip_size_kb": 0, 00:15:54.343 "state": "online", 00:15:54.343 "raid_level": "raid1", 00:15:54.343 "superblock": true, 00:15:54.343 "num_base_bdevs": 3, 00:15:54.343 "num_base_bdevs_discovered": 3, 00:15:54.343 "num_base_bdevs_operational": 3, 00:15:54.343 "base_bdevs_list": [ 00:15:54.343 { 00:15:54.343 "name": "NewBaseBdev", 00:15:54.343 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:54.343 "is_configured": true, 00:15:54.343 "data_offset": 2048, 00:15:54.343 "data_size": 63488 00:15:54.344 }, 00:15:54.344 { 00:15:54.344 "name": "BaseBdev2", 00:15:54.344 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:54.344 "is_configured": true, 00:15:54.344 "data_offset": 2048, 00:15:54.344 "data_size": 63488 00:15:54.344 }, 00:15:54.344 { 00:15:54.344 "name": "BaseBdev3", 00:15:54.344 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:54.344 "is_configured": true, 00:15:54.344 "data_offset": 2048, 00:15:54.344 "data_size": 63488 00:15:54.344 } 00:15:54.344 ] 00:15:54.344 } 00:15:54.344 } 00:15:54.344 }' 00:15:54.344 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:54.344 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:54.344 BaseBdev2 00:15:54.344 BaseBdev3' 00:15:54.344 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:54.344 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:54.344 07:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:54.344 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:54.344 "name": "NewBaseBdev", 00:15:54.344 "aliases": [ 00:15:54.344 "7fbbaec8-dfed-4a47-8259-3243639b2051" 00:15:54.344 ], 00:15:54.344 "product_name": "Malloc disk", 00:15:54.344 "block_size": 512, 00:15:54.344 "num_blocks": 65536, 00:15:54.344 "uuid": "7fbbaec8-dfed-4a47-8259-3243639b2051", 00:15:54.344 "assigned_rate_limits": { 00:15:54.344 
"rw_ios_per_sec": 0, 00:15:54.344 "rw_mbytes_per_sec": 0, 00:15:54.344 "r_mbytes_per_sec": 0, 00:15:54.344 "w_mbytes_per_sec": 0 00:15:54.344 }, 00:15:54.344 "claimed": true, 00:15:54.344 "claim_type": "exclusive_write", 00:15:54.344 "zoned": false, 00:15:54.344 "supported_io_types": { 00:15:54.344 "read": true, 00:15:54.344 "write": true, 00:15:54.344 "unmap": true, 00:15:54.344 "flush": true, 00:15:54.344 "reset": true, 00:15:54.344 "nvme_admin": false, 00:15:54.344 "nvme_io": false, 00:15:54.344 "nvme_io_md": false, 00:15:54.344 "write_zeroes": true, 00:15:54.344 "zcopy": true, 00:15:54.344 "get_zone_info": false, 00:15:54.344 "zone_management": false, 00:15:54.344 "zone_append": false, 00:15:54.344 "compare": false, 00:15:54.344 "compare_and_write": false, 00:15:54.344 "abort": true, 00:15:54.344 "seek_hole": false, 00:15:54.344 "seek_data": false, 00:15:54.344 "copy": true, 00:15:54.344 "nvme_iov_md": false 00:15:54.344 }, 00:15:54.344 "memory_domains": [ 00:15:54.344 { 00:15:54.344 "dma_device_id": "system", 00:15:54.344 "dma_device_type": 1 00:15:54.344 }, 00:15:54.344 { 00:15:54.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.344 "dma_device_type": 2 00:15:54.344 } 00:15:54.344 ], 00:15:54.344 "driver_specific": {} 00:15:54.344 }' 00:15:54.344 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.604 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.865 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.865 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.865 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:54.865 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:54.865 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:55.125 "name": "BaseBdev2", 00:15:55.125 "aliases": [ 00:15:55.125 "9e1ea012-e648-49b5-8317-d41e114b117c" 00:15:55.125 ], 00:15:55.125 "product_name": "Malloc disk", 00:15:55.125 "block_size": 512, 00:15:55.125 "num_blocks": 65536, 00:15:55.125 "uuid": "9e1ea012-e648-49b5-8317-d41e114b117c", 00:15:55.125 "assigned_rate_limits": { 00:15:55.125 "rw_ios_per_sec": 0, 00:15:55.125 "rw_mbytes_per_sec": 0, 00:15:55.125 "r_mbytes_per_sec": 0, 00:15:55.125 "w_mbytes_per_sec": 0 
00:15:55.125 }, 00:15:55.125 "claimed": true, 00:15:55.125 "claim_type": "exclusive_write", 00:15:55.125 "zoned": false, 00:15:55.125 "supported_io_types": { 00:15:55.125 "read": true, 00:15:55.125 "write": true, 00:15:55.125 "unmap": true, 00:15:55.125 "flush": true, 00:15:55.125 "reset": true, 00:15:55.125 "nvme_admin": false, 00:15:55.125 "nvme_io": false, 00:15:55.125 "nvme_io_md": false, 00:15:55.125 "write_zeroes": true, 00:15:55.125 "zcopy": true, 00:15:55.125 "get_zone_info": false, 00:15:55.125 "zone_management": false, 00:15:55.125 "zone_append": false, 00:15:55.125 "compare": false, 00:15:55.125 "compare_and_write": false, 00:15:55.125 "abort": true, 00:15:55.125 "seek_hole": false, 00:15:55.125 "seek_data": false, 00:15:55.125 "copy": true, 00:15:55.125 "nvme_iov_md": false 00:15:55.125 }, 00:15:55.125 "memory_domains": [ 00:15:55.125 { 00:15:55.125 "dma_device_id": "system", 00:15:55.125 "dma_device_type": 1 00:15:55.125 }, 00:15:55.125 { 00:15:55.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.125 "dma_device_type": 2 00:15:55.125 } 00:15:55.125 ], 00:15:55.125 "driver_specific": {} 00:15:55.125 }' 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:55.125 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.385 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.385 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.385 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.385 07:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.385 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.385 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:55.385 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:55.385 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:55.645 "name": "BaseBdev3", 00:15:55.645 "aliases": [ 00:15:55.645 "fd9aafc3-5311-4072-a1bc-317cceeb5b78" 00:15:55.645 ], 00:15:55.645 "product_name": "Malloc disk", 00:15:55.645 "block_size": 512, 00:15:55.645 "num_blocks": 65536, 00:15:55.645 "uuid": "fd9aafc3-5311-4072-a1bc-317cceeb5b78", 00:15:55.645 "assigned_rate_limits": { 00:15:55.645 "rw_ios_per_sec": 0, 00:15:55.645 "rw_mbytes_per_sec": 0, 00:15:55.645 "r_mbytes_per_sec": 0, 00:15:55.645 "w_mbytes_per_sec": 0 00:15:55.645 }, 00:15:55.645 "claimed": true, 00:15:55.645 "claim_type": "exclusive_write", 00:15:55.645 "zoned": false, 00:15:55.645 
"supported_io_types": { 00:15:55.645 "read": true, 00:15:55.645 "write": true, 00:15:55.645 "unmap": true, 00:15:55.645 "flush": true, 00:15:55.645 "reset": true, 00:15:55.645 "nvme_admin": false, 00:15:55.645 "nvme_io": false, 00:15:55.645 "nvme_io_md": false, 00:15:55.645 "write_zeroes": true, 00:15:55.645 "zcopy": true, 00:15:55.645 "get_zone_info": false, 00:15:55.645 "zone_management": false, 00:15:55.645 "zone_append": false, 00:15:55.645 "compare": false, 00:15:55.645 "compare_and_write": false, 00:15:55.645 "abort": true, 00:15:55.645 "seek_hole": false, 00:15:55.645 "seek_data": false, 00:15:55.645 "copy": true, 00:15:55.645 "nvme_iov_md": false 00:15:55.645 }, 00:15:55.645 "memory_domains": [ 00:15:55.645 { 00:15:55.645 "dma_device_id": "system", 00:15:55.645 "dma_device_type": 1 00:15:55.645 }, 00:15:55.645 { 00:15:55.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.645 "dma_device_type": 2 00:15:55.645 } 00:15:55.645 ], 00:15:55.645 "driver_specific": {} 00:15:55.645 }' 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:55.645 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.905 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:55.905 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.905 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.905 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:55.905 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.905 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:56.165 [2024-07-15 07:51:40.747156] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:56.165 [2024-07-15 07:51:40.747173] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:56.165 [2024-07-15 07:51:40.747206] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:56.165 [2024-07-15 07:51:40.747407] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:56.165 [2024-07-15 07:51:40.747414] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2528890 name Existed_Raid, state offline 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1644125 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1644125 ']' 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1644125 00:15:56.165 07:51:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1644125 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1644125' 00:15:56.165 killing process with pid 1644125 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1644125 00:15:56.165 [2024-07-15 07:51:40.817838] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:56.165 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1644125 00:15:56.165 [2024-07-15 07:51:40.832432] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:56.426 07:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:56.426 00:15:56.426 real 0m26.013s 00:15:56.426 user 0m48.830s 00:15:56.426 sys 0m3.686s 00:15:56.426 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:56.426 07:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.426 ************************************ 00:15:56.426 END TEST raid_state_function_test_sb 00:15:56.426 ************************************ 00:15:56.426 07:51:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:56.426 07:51:40 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:15:56.426 07:51:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:56.426 07:51:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:56.426 07:51:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:56.426 ************************************ 00:15:56.426 START TEST raid_superblock_test 00:15:56.426 ************************************ 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:56.426 07:51:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1649014 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1649014 /var/tmp/spdk-raid.sock 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1649014 ']' 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:56.426 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:56.426 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.426 [2024-07-15 07:51:41.094225] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:15:56.426 [2024-07-15 07:51:41.094284] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1649014 ] 00:15:56.686 [2024-07-15 07:51:41.182825] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:56.686 [2024-07-15 07:51:41.244671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:56.686 [2024-07-15 07:51:41.285095] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:56.686 [2024-07-15 07:51:41.285117] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:57.257 07:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:57.517 malloc1 00:15:57.517 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:57.777 [2024-07-15 07:51:42.291428] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:57.777 [2024-07-15 07:51:42.291461] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:57.777 [2024-07-15 07:51:42.291472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db9a20 00:15:57.777 [2024-07-15 07:51:42.291478] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:57.777 [2024-07-15 07:51:42.292789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:57.777 [2024-07-15 07:51:42.292808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:57.777 pt1 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:57.777 malloc2 00:15:57.777 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:58.037 [2024-07-15 07:51:42.678438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:58.037 [2024-07-15 07:51:42.678465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:58.037 [2024-07-15 07:51:42.678477] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dba040 00:15:58.037 [2024-07-15 07:51:42.678484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:58.037 [2024-07-15 07:51:42.679668] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:58.037 [2024-07-15 07:51:42.679686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:58.037 pt2 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:58.037 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:58.296 malloc3 00:15:58.296 07:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:58.556 [2024-07-15 07:51:43.065374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:58.556 [2024-07-15 07:51:43.065403] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:58.556 [2024-07-15 07:51:43.065416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dba540 00:15:58.556 [2024-07-15 07:51:43.065422] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:58.556 [2024-07-15 07:51:43.066604] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:58.556 [2024-07-15 07:51:43.066624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:58.556 pt3 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:58.556 [2024-07-15 07:51:43.257865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:58.556 [2024-07-15 07:51:43.258838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:58.556 [2024-07-15 07:51:43.258878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:58.556 [2024-07-15 07:51:43.258992] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f66a90 00:15:58.556 [2024-07-15 07:51:43.258999] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:58.556 [2024-07-15 07:51:43.259144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f62c50 00:15:58.556 [2024-07-15 07:51:43.259254] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f66a90 00:15:58.556 [2024-07-15 07:51:43.259259] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f66a90 00:15:58.556 [2024-07-15 07:51:43.259326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.556 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:58.816 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.816 "name": "raid_bdev1", 00:15:58.816 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:15:58.816 "strip_size_kb": 0, 00:15:58.816 "state": "online", 00:15:58.816 "raid_level": "raid1", 00:15:58.816 "superblock": true, 00:15:58.816 "num_base_bdevs": 3, 00:15:58.816 
"num_base_bdevs_discovered": 3, 00:15:58.816 "num_base_bdevs_operational": 3, 00:15:58.816 "base_bdevs_list": [ 00:15:58.816 { 00:15:58.816 "name": "pt1", 00:15:58.816 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:58.816 "is_configured": true, 00:15:58.816 "data_offset": 2048, 00:15:58.816 "data_size": 63488 00:15:58.816 }, 00:15:58.816 { 00:15:58.816 "name": "pt2", 00:15:58.816 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:58.816 "is_configured": true, 00:15:58.816 "data_offset": 2048, 00:15:58.816 "data_size": 63488 00:15:58.816 }, 00:15:58.816 { 00:15:58.816 "name": "pt3", 00:15:58.816 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:58.816 "is_configured": true, 00:15:58.816 "data_offset": 2048, 00:15:58.816 "data_size": 63488 00:15:58.816 } 00:15:58.816 ] 00:15:58.816 }' 00:15:58.816 07:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.816 07:51:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:59.385 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:59.663 [2024-07-15 07:51:44.208477] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:59.663 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:59.663 "name": "raid_bdev1", 00:15:59.663 "aliases": [ 00:15:59.663 "82675149-81d9-4128-866f-38a7bc166790" 00:15:59.663 ], 00:15:59.663 "product_name": "Raid Volume", 00:15:59.663 "block_size": 512, 00:15:59.663 "num_blocks": 63488, 00:15:59.663 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:15:59.663 "assigned_rate_limits": { 00:15:59.663 "rw_ios_per_sec": 0, 00:15:59.663 "rw_mbytes_per_sec": 0, 00:15:59.663 "r_mbytes_per_sec": 0, 00:15:59.663 "w_mbytes_per_sec": 0 00:15:59.663 }, 00:15:59.663 "claimed": false, 00:15:59.663 "zoned": false, 00:15:59.663 "supported_io_types": { 00:15:59.663 "read": true, 00:15:59.663 "write": true, 00:15:59.663 "unmap": false, 00:15:59.663 "flush": false, 00:15:59.663 "reset": true, 00:15:59.663 "nvme_admin": false, 00:15:59.663 "nvme_io": false, 00:15:59.663 "nvme_io_md": false, 00:15:59.663 "write_zeroes": true, 00:15:59.663 "zcopy": false, 00:15:59.663 "get_zone_info": false, 00:15:59.663 "zone_management": false, 00:15:59.663 "zone_append": false, 00:15:59.663 "compare": false, 00:15:59.663 "compare_and_write": false, 00:15:59.663 "abort": false, 00:15:59.663 "seek_hole": false, 00:15:59.663 "seek_data": false, 00:15:59.663 "copy": false, 00:15:59.663 "nvme_iov_md": false 00:15:59.663 }, 00:15:59.663 "memory_domains": [ 00:15:59.663 { 00:15:59.663 "dma_device_id": "system", 00:15:59.663 "dma_device_type": 1 00:15:59.663 }, 
00:15:59.663 { 00:15:59.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.663 "dma_device_type": 2 00:15:59.663 }, 00:15:59.663 { 00:15:59.663 "dma_device_id": "system", 00:15:59.663 "dma_device_type": 1 00:15:59.663 }, 00:15:59.663 { 00:15:59.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.663 "dma_device_type": 2 00:15:59.663 }, 00:15:59.663 { 00:15:59.663 "dma_device_id": "system", 00:15:59.663 "dma_device_type": 1 00:15:59.663 }, 00:15:59.663 { 00:15:59.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.663 "dma_device_type": 2 00:15:59.663 } 00:15:59.663 ], 00:15:59.663 "driver_specific": { 00:15:59.663 "raid": { 00:15:59.663 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:15:59.663 "strip_size_kb": 0, 00:15:59.663 "state": "online", 00:15:59.663 "raid_level": "raid1", 00:15:59.663 "superblock": true, 00:15:59.663 "num_base_bdevs": 3, 00:15:59.663 "num_base_bdevs_discovered": 3, 00:15:59.663 "num_base_bdevs_operational": 3, 00:15:59.663 "base_bdevs_list": [ 00:15:59.663 { 00:15:59.663 "name": "pt1", 00:15:59.663 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:59.663 "is_configured": true, 00:15:59.663 "data_offset": 2048, 00:15:59.663 "data_size": 63488 00:15:59.663 }, 00:15:59.663 { 00:15:59.663 "name": "pt2", 00:15:59.663 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:59.663 "is_configured": true, 00:15:59.663 "data_offset": 2048, 00:15:59.663 "data_size": 63488 00:15:59.663 }, 00:15:59.663 { 00:15:59.663 "name": "pt3", 00:15:59.663 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:59.663 "is_configured": true, 00:15:59.663 "data_offset": 2048, 00:15:59.663 "data_size": 63488 00:15:59.663 } 00:15:59.663 ] 00:15:59.663 } 00:15:59.663 } 00:15:59.663 }' 00:15:59.663 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:59.663 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:59.663 pt2 00:15:59.663 pt3' 00:15:59.663 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:59.664 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:59.664 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:59.933 "name": "pt1", 00:15:59.933 "aliases": [ 00:15:59.933 "00000000-0000-0000-0000-000000000001" 00:15:59.933 ], 00:15:59.933 "product_name": "passthru", 00:15:59.933 "block_size": 512, 00:15:59.933 "num_blocks": 65536, 00:15:59.933 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:59.933 "assigned_rate_limits": { 00:15:59.933 "rw_ios_per_sec": 0, 00:15:59.933 "rw_mbytes_per_sec": 0, 00:15:59.933 "r_mbytes_per_sec": 0, 00:15:59.933 "w_mbytes_per_sec": 0 00:15:59.933 }, 00:15:59.933 "claimed": true, 00:15:59.933 "claim_type": "exclusive_write", 00:15:59.933 "zoned": false, 00:15:59.933 "supported_io_types": { 00:15:59.933 "read": true, 00:15:59.933 "write": true, 00:15:59.933 "unmap": true, 00:15:59.933 "flush": true, 00:15:59.933 "reset": true, 00:15:59.933 "nvme_admin": false, 00:15:59.933 "nvme_io": false, 00:15:59.933 "nvme_io_md": false, 00:15:59.933 "write_zeroes": true, 00:15:59.933 "zcopy": true, 00:15:59.933 "get_zone_info": false, 00:15:59.933 "zone_management": false, 00:15:59.933 
"zone_append": false, 00:15:59.933 "compare": false, 00:15:59.933 "compare_and_write": false, 00:15:59.933 "abort": true, 00:15:59.933 "seek_hole": false, 00:15:59.933 "seek_data": false, 00:15:59.933 "copy": true, 00:15:59.933 "nvme_iov_md": false 00:15:59.933 }, 00:15:59.933 "memory_domains": [ 00:15:59.933 { 00:15:59.933 "dma_device_id": "system", 00:15:59.933 "dma_device_type": 1 00:15:59.933 }, 00:15:59.933 { 00:15:59.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.933 "dma_device_type": 2 00:15:59.933 } 00:15:59.933 ], 00:15:59.933 "driver_specific": { 00:15:59.933 "passthru": { 00:15:59.933 "name": "pt1", 00:15:59.933 "base_bdev_name": "malloc1" 00:15:59.933 } 00:15:59.933 } 00:15:59.933 }' 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.933 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:00.193 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:00.474 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.474 "name": "pt2", 00:16:00.474 "aliases": [ 00:16:00.474 "00000000-0000-0000-0000-000000000002" 00:16:00.474 ], 00:16:00.474 "product_name": "passthru", 00:16:00.474 "block_size": 512, 00:16:00.474 "num_blocks": 65536, 00:16:00.474 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:00.474 "assigned_rate_limits": { 00:16:00.474 "rw_ios_per_sec": 0, 00:16:00.474 "rw_mbytes_per_sec": 0, 00:16:00.474 "r_mbytes_per_sec": 0, 00:16:00.474 "w_mbytes_per_sec": 0 00:16:00.474 }, 00:16:00.474 "claimed": true, 00:16:00.474 "claim_type": "exclusive_write", 00:16:00.474 "zoned": false, 00:16:00.474 "supported_io_types": { 00:16:00.474 "read": true, 00:16:00.474 "write": true, 00:16:00.474 "unmap": true, 00:16:00.474 "flush": true, 00:16:00.474 "reset": true, 00:16:00.474 "nvme_admin": false, 00:16:00.474 "nvme_io": false, 00:16:00.474 "nvme_io_md": false, 00:16:00.474 "write_zeroes": true, 00:16:00.474 "zcopy": true, 00:16:00.474 "get_zone_info": false, 00:16:00.474 "zone_management": false, 00:16:00.474 "zone_append": false, 00:16:00.474 "compare": false, 00:16:00.474 "compare_and_write": false, 00:16:00.474 "abort": true, 00:16:00.474 
"seek_hole": false, 00:16:00.474 "seek_data": false, 00:16:00.474 "copy": true, 00:16:00.474 "nvme_iov_md": false 00:16:00.474 }, 00:16:00.474 "memory_domains": [ 00:16:00.474 { 00:16:00.474 "dma_device_id": "system", 00:16:00.474 "dma_device_type": 1 00:16:00.474 }, 00:16:00.474 { 00:16:00.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.474 "dma_device_type": 2 00:16:00.474 } 00:16:00.474 ], 00:16:00.474 "driver_specific": { 00:16:00.474 "passthru": { 00:16:00.474 "name": "pt2", 00:16:00.474 "base_bdev_name": "malloc2" 00:16:00.474 } 00:16:00.474 } 00:16:00.474 }' 00:16:00.474 07:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.474 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:00.733 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:00.992 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.992 "name": "pt3", 00:16:00.992 "aliases": [ 00:16:00.992 "00000000-0000-0000-0000-000000000003" 00:16:00.992 ], 00:16:00.992 "product_name": "passthru", 00:16:00.992 "block_size": 512, 00:16:00.992 "num_blocks": 65536, 00:16:00.992 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:00.992 "assigned_rate_limits": { 00:16:00.992 "rw_ios_per_sec": 0, 00:16:00.992 "rw_mbytes_per_sec": 0, 00:16:00.992 "r_mbytes_per_sec": 0, 00:16:00.992 "w_mbytes_per_sec": 0 00:16:00.992 }, 00:16:00.992 "claimed": true, 00:16:00.992 "claim_type": "exclusive_write", 00:16:00.992 "zoned": false, 00:16:00.992 "supported_io_types": { 00:16:00.992 "read": true, 00:16:00.992 "write": true, 00:16:00.992 "unmap": true, 00:16:00.992 "flush": true, 00:16:00.992 "reset": true, 00:16:00.992 "nvme_admin": false, 00:16:00.992 "nvme_io": false, 00:16:00.992 "nvme_io_md": false, 00:16:00.992 "write_zeroes": true, 00:16:00.992 "zcopy": true, 00:16:00.992 "get_zone_info": false, 00:16:00.992 "zone_management": false, 00:16:00.992 "zone_append": false, 00:16:00.992 "compare": false, 00:16:00.992 "compare_and_write": false, 00:16:00.992 "abort": true, 00:16:00.992 "seek_hole": false, 00:16:00.992 "seek_data": false, 00:16:00.992 "copy": true, 00:16:00.992 "nvme_iov_md": false 00:16:00.992 }, 
00:16:00.992 "memory_domains": [ 00:16:00.992 { 00:16:00.992 "dma_device_id": "system", 00:16:00.992 "dma_device_type": 1 00:16:00.992 }, 00:16:00.992 { 00:16:00.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.993 "dma_device_type": 2 00:16:00.993 } 00:16:00.993 ], 00:16:00.993 "driver_specific": { 00:16:00.993 "passthru": { 00:16:00.993 "name": "pt3", 00:16:00.993 "base_bdev_name": "malloc3" 00:16:00.993 } 00:16:00.993 } 00:16:00.993 }' 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:00.993 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:01.252 07:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:01.511 [2024-07-15 07:51:46.053156] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:01.511 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=82675149-81d9-4128-866f-38a7bc166790 00:16:01.511 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 82675149-81d9-4128-866f-38a7bc166790 ']' 00:16:01.511 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:01.511 [2024-07-15 07:51:46.245425] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:01.511 [2024-07-15 07:51:46.245440] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:01.511 [2024-07-15 07:51:46.245478] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:01.511 [2024-07-15 07:51:46.245529] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:01.511 [2024-07-15 07:51:46.245536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f66a90 name raid_bdev1, state offline 00:16:01.511 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.511 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:01.770 07:51:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:01.770 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:01.770 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:01.770 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:02.030 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:02.030 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:02.289 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:02.289 07:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:02.289 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:02.289 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:02.550 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:02.810 
[2024-07-15 07:51:47.384261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:02.810 [2024-07-15 07:51:47.385321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:02.810 [2024-07-15 07:51:47.385355] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:02.810 [2024-07-15 07:51:47.385389] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:02.810 [2024-07-15 07:51:47.385416] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:02.810 [2024-07-15 07:51:47.385430] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:02.810 [2024-07-15 07:51:47.385439] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:02.810 [2024-07-15 07:51:47.385445] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f62bf0 name raid_bdev1, state configuring 00:16:02.810 request: 00:16:02.810 { 00:16:02.810 "name": "raid_bdev1", 00:16:02.810 "raid_level": "raid1", 00:16:02.810 "base_bdevs": [ 00:16:02.810 "malloc1", 00:16:02.810 "malloc2", 00:16:02.810 "malloc3" 00:16:02.810 ], 00:16:02.810 "superblock": false, 00:16:02.810 "method": "bdev_raid_create", 00:16:02.810 "req_id": 1 00:16:02.810 } 00:16:02.810 Got JSON-RPC error response 00:16:02.810 response: 00:16:02.810 { 00:16:02.810 "code": -17, 00:16:02.810 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:02.810 } 00:16:02.810 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:02.810 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:02.810 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:02.810 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:02.810 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.810 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:03.070 [2024-07-15 07:51:47.769190] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:03.070 [2024-07-15 07:51:47.769217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.070 [2024-07-15 07:51:47.769227] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dbae00 00:16:03.070 [2024-07-15 07:51:47.769233] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.070 [2024-07-15 07:51:47.770490] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.070 [2024-07-15 07:51:47.770510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:03.070 
[2024-07-15 07:51:47.770555] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:03.070 [2024-07-15 07:51:47.770572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:03.070 pt1 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.070 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:03.331 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.331 "name": "raid_bdev1", 00:16:03.331 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:03.331 "strip_size_kb": 0, 00:16:03.331 "state": "configuring", 00:16:03.331 "raid_level": "raid1", 00:16:03.331 "superblock": true, 00:16:03.331 "num_base_bdevs": 3, 00:16:03.331 "num_base_bdevs_discovered": 1, 00:16:03.331 "num_base_bdevs_operational": 3, 00:16:03.331 "base_bdevs_list": [ 00:16:03.331 { 00:16:03.331 "name": "pt1", 00:16:03.331 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:03.331 "is_configured": true, 00:16:03.331 "data_offset": 2048, 00:16:03.331 "data_size": 63488 00:16:03.331 }, 00:16:03.331 { 00:16:03.331 "name": null, 00:16:03.331 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:03.331 "is_configured": false, 00:16:03.331 "data_offset": 2048, 00:16:03.331 "data_size": 63488 00:16:03.331 }, 00:16:03.331 { 00:16:03.331 "name": null, 00:16:03.331 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:03.331 "is_configured": false, 00:16:03.331 "data_offset": 2048, 00:16:03.331 "data_size": 63488 00:16:03.331 } 00:16:03.331 ] 00:16:03.331 }' 00:16:03.331 07:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.331 07:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.901 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:03.901 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:03.901 [2024-07-15 07:51:48.627368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:03.901 
[2024-07-15 07:51:48.627399] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.901 [2024-07-15 07:51:48.627410] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db9c50 00:16:03.901 [2024-07-15 07:51:48.627417] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.901 [2024-07-15 07:51:48.627673] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.901 [2024-07-15 07:51:48.627685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:03.901 [2024-07-15 07:51:48.627731] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:03.901 [2024-07-15 07:51:48.627743] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:03.901 pt2 00:16:03.901 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:04.162 [2024-07-15 07:51:48.823869] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.162 07:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:04.422 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.422 "name": "raid_bdev1", 00:16:04.422 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:04.422 "strip_size_kb": 0, 00:16:04.422 "state": "configuring", 00:16:04.422 "raid_level": "raid1", 00:16:04.422 "superblock": true, 00:16:04.422 "num_base_bdevs": 3, 00:16:04.422 "num_base_bdevs_discovered": 1, 00:16:04.422 "num_base_bdevs_operational": 3, 00:16:04.422 "base_bdevs_list": [ 00:16:04.422 { 00:16:04.422 "name": "pt1", 00:16:04.422 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:04.422 "is_configured": true, 00:16:04.422 "data_offset": 2048, 00:16:04.422 "data_size": 63488 00:16:04.422 }, 00:16:04.422 { 00:16:04.422 "name": null, 00:16:04.422 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:04.422 "is_configured": false, 00:16:04.422 "data_offset": 2048, 00:16:04.422 "data_size": 63488 00:16:04.422 }, 00:16:04.422 { 00:16:04.422 
"name": null, 00:16:04.422 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:04.422 "is_configured": false, 00:16:04.422 "data_offset": 2048, 00:16:04.422 "data_size": 63488 00:16:04.422 } 00:16:04.422 ] 00:16:04.422 }' 00:16:04.422 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.422 07:51:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.992 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:04.992 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:04.992 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:04.992 [2024-07-15 07:51:49.734172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:04.992 [2024-07-15 07:51:49.734208] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:04.992 [2024-07-15 07:51:49.734222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f68880 00:16:04.992 [2024-07-15 07:51:49.734228] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:04.992 [2024-07-15 07:51:49.734494] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:04.992 [2024-07-15 07:51:49.734506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:04.992 [2024-07-15 07:51:49.734552] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:04.992 [2024-07-15 07:51:49.734565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:04.992 pt2 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:05.253 [2024-07-15 07:51:49.926657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:05.253 [2024-07-15 07:51:49.926677] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:05.253 [2024-07-15 07:51:49.926685] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f64920 00:16:05.253 [2024-07-15 07:51:49.926691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:05.253 [2024-07-15 07:51:49.926917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:05.253 [2024-07-15 07:51:49.926928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:05.253 [2024-07-15 07:51:49.926960] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:05.253 [2024-07-15 07:51:49.926976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:05.253 [2024-07-15 07:51:49.927056] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f64100 00:16:05.253 [2024-07-15 07:51:49.927062] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:05.253 [2024-07-15 
07:51:49.927197] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f6b620 00:16:05.253 [2024-07-15 07:51:49.927300] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f64100 00:16:05.253 [2024-07-15 07:51:49.927305] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f64100 00:16:05.253 [2024-07-15 07:51:49.927374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:05.253 pt3 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:05.253 07:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.513 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.513 "name": "raid_bdev1", 00:16:05.513 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:05.513 "strip_size_kb": 0, 00:16:05.513 "state": "online", 00:16:05.513 "raid_level": "raid1", 00:16:05.513 "superblock": true, 00:16:05.513 "num_base_bdevs": 3, 00:16:05.513 "num_base_bdevs_discovered": 3, 00:16:05.513 "num_base_bdevs_operational": 3, 00:16:05.513 "base_bdevs_list": [ 00:16:05.513 { 00:16:05.513 "name": "pt1", 00:16:05.513 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:05.513 "is_configured": true, 00:16:05.513 "data_offset": 2048, 00:16:05.513 "data_size": 63488 00:16:05.513 }, 00:16:05.513 { 00:16:05.513 "name": "pt2", 00:16:05.513 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:05.513 "is_configured": true, 00:16:05.513 "data_offset": 2048, 00:16:05.513 "data_size": 63488 00:16:05.513 }, 00:16:05.513 { 00:16:05.513 "name": "pt3", 00:16:05.513 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:05.513 "is_configured": true, 00:16:05.513 "data_offset": 2048, 00:16:05.513 "data_size": 63488 00:16:05.513 } 00:16:05.513 ] 00:16:05.513 }' 00:16:05.513 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.513 07:51:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.083 07:51:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:06.083 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:06.344 [2024-07-15 07:51:50.865259] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:06.344 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:06.344 "name": "raid_bdev1", 00:16:06.344 "aliases": [ 00:16:06.344 "82675149-81d9-4128-866f-38a7bc166790" 00:16:06.344 ], 00:16:06.344 "product_name": "Raid Volume", 00:16:06.344 "block_size": 512, 00:16:06.344 "num_blocks": 63488, 00:16:06.344 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:06.344 "assigned_rate_limits": { 00:16:06.344 "rw_ios_per_sec": 0, 00:16:06.344 "rw_mbytes_per_sec": 0, 00:16:06.344 "r_mbytes_per_sec": 0, 00:16:06.344 "w_mbytes_per_sec": 0 00:16:06.344 }, 00:16:06.344 "claimed": false, 00:16:06.344 "zoned": false, 00:16:06.344 "supported_io_types": { 00:16:06.344 "read": true, 00:16:06.344 "write": true, 00:16:06.344 "unmap": false, 00:16:06.344 "flush": false, 00:16:06.344 "reset": true, 00:16:06.344 "nvme_admin": false, 00:16:06.344 "nvme_io": false, 00:16:06.344 "nvme_io_md": false, 00:16:06.344 "write_zeroes": true, 00:16:06.344 "zcopy": false, 00:16:06.344 "get_zone_info": false, 00:16:06.344 "zone_management": false, 00:16:06.344 "zone_append": false, 00:16:06.344 "compare": false, 00:16:06.344 "compare_and_write": false, 00:16:06.344 "abort": false, 00:16:06.344 "seek_hole": false, 00:16:06.344 "seek_data": false, 00:16:06.344 "copy": false, 00:16:06.344 "nvme_iov_md": false 00:16:06.344 }, 00:16:06.344 "memory_domains": [ 00:16:06.344 { 00:16:06.344 "dma_device_id": "system", 00:16:06.344 "dma_device_type": 1 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.344 "dma_device_type": 2 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "dma_device_id": "system", 00:16:06.344 "dma_device_type": 1 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.344 "dma_device_type": 2 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "dma_device_id": "system", 00:16:06.344 "dma_device_type": 1 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.344 "dma_device_type": 2 00:16:06.344 } 00:16:06.344 ], 00:16:06.344 "driver_specific": { 00:16:06.344 "raid": { 00:16:06.344 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:06.344 "strip_size_kb": 0, 00:16:06.344 "state": "online", 00:16:06.344 "raid_level": "raid1", 00:16:06.344 "superblock": true, 00:16:06.344 "num_base_bdevs": 3, 00:16:06.344 "num_base_bdevs_discovered": 3, 00:16:06.344 "num_base_bdevs_operational": 3, 00:16:06.344 "base_bdevs_list": [ 00:16:06.344 { 00:16:06.344 
"name": "pt1", 00:16:06.344 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:06.344 "is_configured": true, 00:16:06.344 "data_offset": 2048, 00:16:06.344 "data_size": 63488 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "name": "pt2", 00:16:06.344 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:06.344 "is_configured": true, 00:16:06.344 "data_offset": 2048, 00:16:06.344 "data_size": 63488 00:16:06.344 }, 00:16:06.344 { 00:16:06.344 "name": "pt3", 00:16:06.344 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:06.344 "is_configured": true, 00:16:06.344 "data_offset": 2048, 00:16:06.344 "data_size": 63488 00:16:06.344 } 00:16:06.344 ] 00:16:06.344 } 00:16:06.344 } 00:16:06.344 }' 00:16:06.344 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:06.344 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:06.344 pt2 00:16:06.344 pt3' 00:16:06.344 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.344 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:06.344 07:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.604 "name": "pt1", 00:16:06.604 "aliases": [ 00:16:06.604 "00000000-0000-0000-0000-000000000001" 00:16:06.604 ], 00:16:06.604 "product_name": "passthru", 00:16:06.604 "block_size": 512, 00:16:06.604 "num_blocks": 65536, 00:16:06.604 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:06.604 "assigned_rate_limits": { 00:16:06.604 "rw_ios_per_sec": 0, 00:16:06.604 "rw_mbytes_per_sec": 0, 00:16:06.604 "r_mbytes_per_sec": 0, 00:16:06.604 "w_mbytes_per_sec": 0 00:16:06.604 }, 00:16:06.604 "claimed": true, 00:16:06.604 "claim_type": "exclusive_write", 00:16:06.604 "zoned": false, 00:16:06.604 "supported_io_types": { 00:16:06.604 "read": true, 00:16:06.604 "write": true, 00:16:06.604 "unmap": true, 00:16:06.604 "flush": true, 00:16:06.604 "reset": true, 00:16:06.604 "nvme_admin": false, 00:16:06.604 "nvme_io": false, 00:16:06.604 "nvme_io_md": false, 00:16:06.604 "write_zeroes": true, 00:16:06.604 "zcopy": true, 00:16:06.604 "get_zone_info": false, 00:16:06.604 "zone_management": false, 00:16:06.604 "zone_append": false, 00:16:06.604 "compare": false, 00:16:06.604 "compare_and_write": false, 00:16:06.604 "abort": true, 00:16:06.604 "seek_hole": false, 00:16:06.604 "seek_data": false, 00:16:06.604 "copy": true, 00:16:06.604 "nvme_iov_md": false 00:16:06.604 }, 00:16:06.604 "memory_domains": [ 00:16:06.604 { 00:16:06.604 "dma_device_id": "system", 00:16:06.604 "dma_device_type": 1 00:16:06.604 }, 00:16:06.604 { 00:16:06.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.604 "dma_device_type": 2 00:16:06.604 } 00:16:06.604 ], 00:16:06.604 "driver_specific": { 00:16:06.604 "passthru": { 00:16:06.604 "name": "pt1", 00:16:06.604 "base_bdev_name": "malloc1" 00:16:06.604 } 00:16:06.604 } 00:16:06.604 }' 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.604 
07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.604 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.865 "name": "pt2", 00:16:06.865 "aliases": [ 00:16:06.865 "00000000-0000-0000-0000-000000000002" 00:16:06.865 ], 00:16:06.865 "product_name": "passthru", 00:16:06.865 "block_size": 512, 00:16:06.865 "num_blocks": 65536, 00:16:06.865 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:06.865 "assigned_rate_limits": { 00:16:06.865 "rw_ios_per_sec": 0, 00:16:06.865 "rw_mbytes_per_sec": 0, 00:16:06.865 "r_mbytes_per_sec": 0, 00:16:06.865 "w_mbytes_per_sec": 0 00:16:06.865 }, 00:16:06.865 "claimed": true, 00:16:06.865 "claim_type": "exclusive_write", 00:16:06.865 "zoned": false, 00:16:06.865 "supported_io_types": { 00:16:06.865 "read": true, 00:16:06.865 "write": true, 00:16:06.865 "unmap": true, 00:16:06.865 "flush": true, 00:16:06.865 "reset": true, 00:16:06.865 "nvme_admin": false, 00:16:06.865 "nvme_io": false, 00:16:06.865 "nvme_io_md": false, 00:16:06.865 "write_zeroes": true, 00:16:06.865 "zcopy": true, 00:16:06.865 "get_zone_info": false, 00:16:06.865 "zone_management": false, 00:16:06.865 "zone_append": false, 00:16:06.865 "compare": false, 00:16:06.865 "compare_and_write": false, 00:16:06.865 "abort": true, 00:16:06.865 "seek_hole": false, 00:16:06.865 "seek_data": false, 00:16:06.865 "copy": true, 00:16:06.865 "nvme_iov_md": false 00:16:06.865 }, 00:16:06.865 "memory_domains": [ 00:16:06.865 { 00:16:06.865 "dma_device_id": "system", 00:16:06.865 "dma_device_type": 1 00:16:06.865 }, 00:16:06.865 { 00:16:06.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.865 "dma_device_type": 2 00:16:06.865 } 00:16:06.865 ], 00:16:06.865 "driver_specific": { 00:16:06.865 "passthru": { 00:16:06.865 "name": "pt2", 00:16:06.865 "base_bdev_name": "malloc2" 00:16:06.865 } 00:16:06.865 } 00:16:06.865 }' 00:16:06.865 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.125 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.384 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.384 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.384 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.384 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:07.384 07:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.644 "name": "pt3", 00:16:07.644 "aliases": [ 00:16:07.644 "00000000-0000-0000-0000-000000000003" 00:16:07.644 ], 00:16:07.644 "product_name": "passthru", 00:16:07.644 "block_size": 512, 00:16:07.644 "num_blocks": 65536, 00:16:07.644 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:07.644 "assigned_rate_limits": { 00:16:07.644 "rw_ios_per_sec": 0, 00:16:07.644 "rw_mbytes_per_sec": 0, 00:16:07.644 "r_mbytes_per_sec": 0, 00:16:07.644 "w_mbytes_per_sec": 0 00:16:07.644 }, 00:16:07.644 "claimed": true, 00:16:07.644 "claim_type": "exclusive_write", 00:16:07.644 "zoned": false, 00:16:07.644 "supported_io_types": { 00:16:07.644 "read": true, 00:16:07.644 "write": true, 00:16:07.644 "unmap": true, 00:16:07.644 "flush": true, 00:16:07.644 "reset": true, 00:16:07.644 "nvme_admin": false, 00:16:07.644 "nvme_io": false, 00:16:07.644 "nvme_io_md": false, 00:16:07.644 "write_zeroes": true, 00:16:07.644 "zcopy": true, 00:16:07.644 "get_zone_info": false, 00:16:07.644 "zone_management": false, 00:16:07.644 "zone_append": false, 00:16:07.644 "compare": false, 00:16:07.644 "compare_and_write": false, 00:16:07.644 "abort": true, 00:16:07.644 "seek_hole": false, 00:16:07.644 "seek_data": false, 00:16:07.644 "copy": true, 00:16:07.644 "nvme_iov_md": false 00:16:07.644 }, 00:16:07.644 "memory_domains": [ 00:16:07.644 { 00:16:07.644 "dma_device_id": "system", 00:16:07.644 "dma_device_type": 1 00:16:07.644 }, 00:16:07.644 { 00:16:07.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.644 "dma_device_type": 2 00:16:07.644 } 00:16:07.644 ], 00:16:07.644 "driver_specific": { 00:16:07.644 "passthru": { 00:16:07.644 "name": "pt3", 00:16:07.644 "base_bdev_name": "malloc3" 00:16:07.644 } 00:16:07.644 } 00:16:07.644 }' 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.644 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.904 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.904 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.904 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.904 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.904 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:07.904 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:08.165 [2024-07-15 07:51:52.673842] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 82675149-81d9-4128-866f-38a7bc166790 '!=' 82675149-81d9-4128-866f-38a7bc166790 ']' 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:08.165 [2024-07-15 07:51:52.866122] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:08.165 07:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.425 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.425 "name": "raid_bdev1", 00:16:08.425 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:08.425 "strip_size_kb": 0, 00:16:08.425 "state": "online", 00:16:08.425 "raid_level": "raid1", 00:16:08.425 "superblock": true, 
00:16:08.425 "num_base_bdevs": 3, 00:16:08.425 "num_base_bdevs_discovered": 2, 00:16:08.425 "num_base_bdevs_operational": 2, 00:16:08.425 "base_bdevs_list": [ 00:16:08.425 { 00:16:08.425 "name": null, 00:16:08.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.425 "is_configured": false, 00:16:08.425 "data_offset": 2048, 00:16:08.425 "data_size": 63488 00:16:08.425 }, 00:16:08.425 { 00:16:08.425 "name": "pt2", 00:16:08.425 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:08.425 "is_configured": true, 00:16:08.425 "data_offset": 2048, 00:16:08.425 "data_size": 63488 00:16:08.425 }, 00:16:08.425 { 00:16:08.425 "name": "pt3", 00:16:08.425 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:08.425 "is_configured": true, 00:16:08.425 "data_offset": 2048, 00:16:08.425 "data_size": 63488 00:16:08.425 } 00:16:08.425 ] 00:16:08.425 }' 00:16:08.425 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.425 07:51:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:08.992 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:09.252 [2024-07-15 07:51:53.784442] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:09.252 [2024-07-15 07:51:53.784461] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:09.252 [2024-07-15 07:51:53.784496] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:09.252 [2024-07-15 07:51:53.784537] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:09.252 [2024-07-15 07:51:53.784543] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f64100 name raid_bdev1, state offline 00:16:09.252 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.252 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:16:09.252 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:16:09.252 07:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:16:09.252 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:16:09.252 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:09.252 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:09.512 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:16:09.512 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:09.512 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:09.795 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:16:09.795 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:09.795 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:16:09.795 07:51:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:16:09.795 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:10.055 [2024-07-15 07:51:54.554355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:10.055 [2024-07-15 07:51:54.554387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:10.055 [2024-07-15 07:51:54.554397] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f6b420 00:16:10.055 [2024-07-15 07:51:54.554404] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:10.055 [2024-07-15 07:51:54.555703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:10.055 [2024-07-15 07:51:54.555737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:10.055 [2024-07-15 07:51:54.555784] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:10.055 [2024-07-15 07:51:54.555802] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:10.055 pt2 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:10.055 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.055 "name": "raid_bdev1", 00:16:10.055 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:10.055 "strip_size_kb": 0, 00:16:10.055 "state": "configuring", 00:16:10.055 "raid_level": "raid1", 00:16:10.055 "superblock": true, 00:16:10.055 "num_base_bdevs": 3, 00:16:10.055 "num_base_bdevs_discovered": 1, 00:16:10.055 "num_base_bdevs_operational": 2, 00:16:10.055 "base_bdevs_list": [ 00:16:10.055 { 00:16:10.055 "name": null, 00:16:10.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.055 "is_configured": false, 00:16:10.055 "data_offset": 2048, 00:16:10.055 "data_size": 63488 00:16:10.055 }, 00:16:10.055 { 00:16:10.055 "name": "pt2", 00:16:10.055 "uuid": "00000000-0000-0000-0000-000000000002", 
00:16:10.055 "is_configured": true, 00:16:10.055 "data_offset": 2048, 00:16:10.055 "data_size": 63488 00:16:10.055 }, 00:16:10.055 { 00:16:10.055 "name": null, 00:16:10.055 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:10.055 "is_configured": false, 00:16:10.055 "data_offset": 2048, 00:16:10.055 "data_size": 63488 00:16:10.055 } 00:16:10.055 ] 00:16:10.055 }' 00:16:10.056 07:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.056 07:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.625 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:16:10.625 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:16:10.625 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:16:10.625 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:10.886 [2024-07-15 07:51:55.428570] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:10.886 [2024-07-15 07:51:55.428606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:10.886 [2024-07-15 07:51:55.428618] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f63aa0 00:16:10.886 [2024-07-15 07:51:55.428625] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:10.886 [2024-07-15 07:51:55.428892] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:10.886 [2024-07-15 07:51:55.428904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:10.886 [2024-07-15 07:51:55.428948] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:10.886 [2024-07-15 07:51:55.428961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:10.886 [2024-07-15 07:51:55.429035] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1db8360 00:16:10.886 [2024-07-15 07:51:55.429041] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:10.886 [2024-07-15 07:51:55.429173] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f6ab60 00:16:10.886 [2024-07-15 07:51:55.429273] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1db8360 00:16:10.886 [2024-07-15 07:51:55.429278] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1db8360 00:16:10.886 [2024-07-15 07:51:55.429348] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:10.886 pt3 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:10.886 07:51:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.886 "name": "raid_bdev1", 00:16:10.886 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:10.886 "strip_size_kb": 0, 00:16:10.886 "state": "online", 00:16:10.886 "raid_level": "raid1", 00:16:10.886 "superblock": true, 00:16:10.886 "num_base_bdevs": 3, 00:16:10.886 "num_base_bdevs_discovered": 2, 00:16:10.886 "num_base_bdevs_operational": 2, 00:16:10.886 "base_bdevs_list": [ 00:16:10.886 { 00:16:10.886 "name": null, 00:16:10.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.886 "is_configured": false, 00:16:10.886 "data_offset": 2048, 00:16:10.886 "data_size": 63488 00:16:10.886 }, 00:16:10.886 { 00:16:10.886 "name": "pt2", 00:16:10.886 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:10.886 "is_configured": true, 00:16:10.886 "data_offset": 2048, 00:16:10.886 "data_size": 63488 00:16:10.886 }, 00:16:10.886 { 00:16:10.886 "name": "pt3", 00:16:10.886 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:10.886 "is_configured": true, 00:16:10.886 "data_offset": 2048, 00:16:10.886 "data_size": 63488 00:16:10.886 } 00:16:10.886 ] 00:16:10.886 }' 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.886 07:51:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.457 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:11.717 [2024-07-15 07:51:56.270681] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:11.717 [2024-07-15 07:51:56.270697] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:11.717 [2024-07-15 07:51:56.270741] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:11.717 [2024-07-15 07:51:56.270782] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:11.717 [2024-07-15 07:51:56.270788] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1db8360 name raid_bdev1, state offline 00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
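verify_raid_bdev_state, called repeatedly in this test (bdev_raid.sh@116-128), reduces to pulling the raid_bdev1 entry out of bdev_raid_get_bdevs all and comparing a handful of fields against the expected values. A condensed illustrative rendering in Python, using the field names visible in the raid_bdev_info dumps above; the helper and its argument order simply mirror the shell function and are an assumption, not the test's actual implementation:

import json, subprocess

def verify_raid_bdev_state(name, expected_state, raid_level, strip_size,
                           num_operational, sock="/var/tmp/spdk-raid.sock"):
    out = subprocess.check_output(
        ["scripts/rpc.py", "-s", sock, "bdev_raid_get_bdevs", "all"])
    info = next(b for b in json.loads(out) if b["name"] == name)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_operational"] == num_operational
    # Extra sanity check: configured members should match the discovered count.
    configured = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert configured == info["num_base_bdevs_discovered"]
    return info

# For example, the configuring-state check this test uses once a member is removed:
verify_raid_bdev_state("raid_bdev1", "configuring", "raid1", 0, 2)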
00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:16:11.717 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:11.978 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:12.238 [2024-07-15 07:51:56.767929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:12.238 [2024-07-15 07:51:56.767959] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:12.238 [2024-07-15 07:51:56.767970] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f65330 00:16:12.238 [2024-07-15 07:51:56.767976] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:12.238 [2024-07-15 07:51:56.769225] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:12.238 [2024-07-15 07:51:56.769245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:12.238 [2024-07-15 07:51:56.769290] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:12.238 [2024-07-15 07:51:56.769308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:12.238 [2024-07-15 07:51:56.769378] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:16:12.238 [2024-07-15 07:51:56.769386] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:12.238 [2024-07-15 07:51:56.769394] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f68c50 name raid_bdev1, state configuring 00:16:12.238 [2024-07-15 07:51:56.769409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:12.238 pt1 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.238 "name": "raid_bdev1", 00:16:12.238 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:12.238 "strip_size_kb": 0, 00:16:12.238 "state": "configuring", 00:16:12.238 "raid_level": "raid1", 00:16:12.238 "superblock": true, 00:16:12.238 "num_base_bdevs": 3, 00:16:12.238 "num_base_bdevs_discovered": 1, 00:16:12.238 "num_base_bdevs_operational": 2, 00:16:12.238 "base_bdevs_list": [ 00:16:12.238 { 00:16:12.238 "name": null, 00:16:12.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.238 "is_configured": false, 00:16:12.238 "data_offset": 2048, 00:16:12.238 "data_size": 63488 00:16:12.238 }, 00:16:12.238 { 00:16:12.238 "name": "pt2", 00:16:12.238 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:12.238 "is_configured": true, 00:16:12.238 "data_offset": 2048, 00:16:12.238 "data_size": 63488 00:16:12.238 }, 00:16:12.238 { 00:16:12.238 "name": null, 00:16:12.238 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:12.238 "is_configured": false, 00:16:12.238 "data_offset": 2048, 00:16:12.238 "data_size": 63488 00:16:12.238 } 00:16:12.238 ] 00:16:12.238 }' 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.238 07:51:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.808 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:16:12.808 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:16:13.067 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:16:13.067 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:13.327 [2024-07-15 07:51:57.830637] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:13.327 [2024-07-15 07:51:57.830675] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:13.327 [2024-07-15 07:51:57.830686] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db90f0 00:16:13.327 [2024-07-15 07:51:57.830692] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:13.327 [2024-07-15 07:51:57.830960] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:13.327 [2024-07-15 07:51:57.830974] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:13.327 [2024-07-15 07:51:57.831018] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:13.327 [2024-07-15 07:51:57.831032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:13.327 [2024-07-15 07:51:57.831107] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e4b360 00:16:13.327 [2024-07-15 07:51:57.831113] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:13.327 [2024-07-15 07:51:57.831243] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f69ac0 00:16:13.327 [2024-07-15 07:51:57.831342] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e4b360 00:16:13.327 [2024-07-15 07:51:57.831348] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e4b360 00:16:13.327 [2024-07-15 07:51:57.831419] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:13.327 pt3 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.327 07:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:13.327 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.327 "name": "raid_bdev1", 00:16:13.327 "uuid": "82675149-81d9-4128-866f-38a7bc166790", 00:16:13.327 "strip_size_kb": 0, 00:16:13.327 "state": "online", 00:16:13.327 "raid_level": "raid1", 00:16:13.327 "superblock": true, 00:16:13.327 "num_base_bdevs": 3, 00:16:13.327 "num_base_bdevs_discovered": 2, 00:16:13.327 "num_base_bdevs_operational": 2, 00:16:13.327 "base_bdevs_list": [ 00:16:13.327 { 00:16:13.327 "name": null, 00:16:13.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.327 "is_configured": false, 00:16:13.327 "data_offset": 2048, 00:16:13.327 "data_size": 63488 00:16:13.327 }, 00:16:13.327 { 00:16:13.327 "name": "pt2", 00:16:13.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:13.327 "is_configured": true, 00:16:13.327 "data_offset": 2048, 00:16:13.327 "data_size": 63488 00:16:13.327 }, 00:16:13.327 { 00:16:13.327 "name": "pt3", 00:16:13.327 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:13.327 "is_configured": true, 00:16:13.327 "data_offset": 2048, 00:16:13.327 "data_size": 63488 00:16:13.327 } 00:16:13.327 ] 00:16:13.327 }' 00:16:13.327 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.327 07:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.933 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:16:13.933 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:16:14.193 07:51:58 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:16:14.193 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:14.193 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:16:14.453 [2024-07-15 07:51:58.969702] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 82675149-81d9-4128-866f-38a7bc166790 '!=' 82675149-81d9-4128-866f-38a7bc166790 ']' 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1649014 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1649014 ']' 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1649014 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:14.453 07:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1649014 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1649014' 00:16:14.453 killing process with pid 1649014 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1649014 00:16:14.453 [2024-07-15 07:51:59.042556] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:14.453 [2024-07-15 07:51:59.042596] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:14.453 [2024-07-15 07:51:59.042639] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:14.453 [2024-07-15 07:51:59.042646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e4b360 name raid_bdev1, state offline 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1649014 00:16:14.453 [2024-07-15 07:51:59.057616] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:14.453 00:16:14.453 real 0m18.146s 00:16:14.453 user 0m33.809s 00:16:14.453 sys 0m2.722s 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:14.453 07:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.453 ************************************ 00:16:14.453 END TEST raid_superblock_test 00:16:14.453 ************************************ 00:16:14.714 07:51:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:14.714 07:51:59 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:16:14.714 07:51:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:14.714 07:51:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:14.714 07:51:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:14.714 
************************************ 00:16:14.714 START TEST raid_read_error_test 00:16:14.714 ************************************ 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9lNW6Qu5Az 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1652528 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1652528 /var/tmp/spdk-raid.sock 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1652528 ']' 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:14.714 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:14.714 07:51:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.714 [2024-07-15 07:51:59.320406] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:16:14.714 [2024-07-15 07:51:59.320459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652528 ] 00:16:14.714 [2024-07-15 07:51:59.412634] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:14.974 [2024-07-15 07:51:59.490298] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:14.974 [2024-07-15 07:51:59.533839] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:14.974 [2024-07-15 07:51:59.533863] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:15.543 07:52:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:15.543 07:52:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:15.543 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:15.543 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:15.803 BaseBdev1_malloc 00:16:15.803 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:15.803 true 00:16:16.064 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:16.064 [2024-07-15 07:52:00.741464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:16.064 [2024-07-15 07:52:00.741498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.064 [2024-07-15 07:52:00.741509] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24efb50 00:16:16.064 [2024-07-15 07:52:00.741515] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.064 [2024-07-15 07:52:00.742811] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.064 [2024-07-15 07:52:00.742830] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:16.064 BaseBdev1 00:16:16.064 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:16.064 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:16.324 BaseBdev2_malloc 00:16:16.325 07:52:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:16.586 true 00:16:16.586 07:52:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:16.586 [2024-07-15 07:52:01.320633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:16.586 [2024-07-15 07:52:01.320667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.586 [2024-07-15 07:52:01.320678] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d3ea0 00:16:16.586 [2024-07-15 07:52:01.320685] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.586 [2024-07-15 07:52:01.321834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.586 [2024-07-15 07:52:01.321853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:16.586 BaseBdev2 00:16:16.586 07:52:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:16.586 07:52:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:16.846 BaseBdev3_malloc 00:16:16.846 07:52:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:17.106 true 00:16:17.106 07:52:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:17.367 [2024-07-15 07:52:01.907743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:17.367 [2024-07-15 07:52:01.907769] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.367 [2024-07-15 07:52:01.907780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24d7fb0 00:16:17.367 [2024-07-15 07:52:01.907786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.367 [2024-07-15 07:52:01.908931] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.367 [2024-07-15 07:52:01.908949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:17.367 BaseBdev3 00:16:17.367 07:52:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:17.367 [2024-07-15 07:52:02.100244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:17.367 [2024-07-15 07:52:02.101226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:17.367 [2024-07-15 07:52:02.101278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:17.367 [2024-07-15 
07:52:02.101432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24d90e0 00:16:17.367 [2024-07-15 07:52:02.101440] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:17.367 [2024-07-15 07:52:02.101577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24d69b0 00:16:17.367 [2024-07-15 07:52:02.101694] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24d90e0 00:16:17.367 [2024-07-15 07:52:02.101700] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24d90e0 00:16:17.367 [2024-07-15 07:52:02.101782] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.367 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.627 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.628 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:17.628 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.628 "name": "raid_bdev1", 00:16:17.628 "uuid": "2c8f152f-c938-4dcb-91d4-aa511ffe3e06", 00:16:17.628 "strip_size_kb": 0, 00:16:17.628 "state": "online", 00:16:17.628 "raid_level": "raid1", 00:16:17.628 "superblock": true, 00:16:17.628 "num_base_bdevs": 3, 00:16:17.628 "num_base_bdevs_discovered": 3, 00:16:17.628 "num_base_bdevs_operational": 3, 00:16:17.628 "base_bdevs_list": [ 00:16:17.628 { 00:16:17.628 "name": "BaseBdev1", 00:16:17.628 "uuid": "5ceffa12-7a93-5bf0-a102-d8c873c5b586", 00:16:17.628 "is_configured": true, 00:16:17.628 "data_offset": 2048, 00:16:17.628 "data_size": 63488 00:16:17.628 }, 00:16:17.628 { 00:16:17.628 "name": "BaseBdev2", 00:16:17.628 "uuid": "9c144690-0e6f-5a30-8eaf-5cea4d79e800", 00:16:17.628 "is_configured": true, 00:16:17.628 "data_offset": 2048, 00:16:17.628 "data_size": 63488 00:16:17.628 }, 00:16:17.628 { 00:16:17.628 "name": "BaseBdev3", 00:16:17.628 "uuid": "0ef4fc67-407e-5ddc-b492-541c5df39c93", 00:16:17.628 "is_configured": true, 00:16:17.628 "data_offset": 2048, 00:16:17.628 "data_size": 63488 00:16:17.628 } 00:16:17.628 ] 00:16:17.628 }' 00:16:17.628 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.628 07:52:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set 
+x 00:16:18.197 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:18.197 07:52:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:18.197 [2024-07-15 07:52:02.942600] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24dd2e0 00:16:19.134 07:52:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:19.393 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:19.393 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:16:19.393 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.394 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:19.653 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.653 "name": "raid_bdev1", 00:16:19.653 "uuid": "2c8f152f-c938-4dcb-91d4-aa511ffe3e06", 00:16:19.653 "strip_size_kb": 0, 00:16:19.653 "state": "online", 00:16:19.653 "raid_level": "raid1", 00:16:19.653 "superblock": true, 00:16:19.653 "num_base_bdevs": 3, 00:16:19.653 "num_base_bdevs_discovered": 3, 00:16:19.653 "num_base_bdevs_operational": 3, 00:16:19.653 "base_bdevs_list": [ 00:16:19.653 { 00:16:19.653 "name": "BaseBdev1", 00:16:19.653 "uuid": "5ceffa12-7a93-5bf0-a102-d8c873c5b586", 00:16:19.653 "is_configured": true, 00:16:19.653 "data_offset": 2048, 00:16:19.653 "data_size": 63488 00:16:19.653 }, 00:16:19.653 { 00:16:19.653 "name": "BaseBdev2", 00:16:19.653 "uuid": "9c144690-0e6f-5a30-8eaf-5cea4d79e800", 00:16:19.653 "is_configured": true, 00:16:19.653 "data_offset": 2048, 00:16:19.653 "data_size": 63488 00:16:19.653 }, 00:16:19.653 { 00:16:19.653 "name": "BaseBdev3", 00:16:19.653 "uuid": 
"0ef4fc67-407e-5ddc-b492-541c5df39c93", 00:16:19.653 "is_configured": true, 00:16:19.653 "data_offset": 2048, 00:16:19.653 "data_size": 63488 00:16:19.653 } 00:16:19.653 ] 00:16:19.653 }' 00:16:19.653 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.653 07:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.220 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:20.220 [2024-07-15 07:52:04.943932] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:20.220 [2024-07-15 07:52:04.943964] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:20.220 [2024-07-15 07:52:04.946515] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:20.220 [2024-07-15 07:52:04.946542] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:20.220 [2024-07-15 07:52:04.946619] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:20.220 [2024-07-15 07:52:04.946625] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24d90e0 name raid_bdev1, state offline 00:16:20.220 0 00:16:20.480 07:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1652528 00:16:20.480 07:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1652528 ']' 00:16:20.480 07:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1652528 00:16:20.480 07:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:20.480 07:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:20.480 07:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1652528 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1652528' 00:16:20.480 killing process with pid 1652528 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1652528 00:16:20.480 [2024-07-15 07:52:05.032550] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1652528 00:16:20.480 [2024-07-15 07:52:05.044011] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9lNW6Qu5Az 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- 
# return 0 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:16:20.480 00:16:20.480 real 0m5.930s 00:16:20.480 user 0m9.460s 00:16:20.480 sys 0m0.855s 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:20.480 07:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.480 ************************************ 00:16:20.480 END TEST raid_read_error_test 00:16:20.480 ************************************ 00:16:20.480 07:52:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:20.480 07:52:05 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:16:20.480 07:52:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:20.480 07:52:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:20.480 07:52:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:20.740 ************************************ 00:16:20.740 START TEST raid_write_error_test 00:16:20.740 ************************************ 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.gPuMb9dAU1 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1653549 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1653549 /var/tmp/spdk-raid.sock 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1653549 ']' 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:20.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:20.740 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.740 [2024-07-15 07:52:05.323339] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:16:20.740 [2024-07-15 07:52:05.323395] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653549 ] 00:16:20.740 [2024-07-15 07:52:05.395505] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:20.740 [2024-07-15 07:52:05.458343] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:21.000 [2024-07-15 07:52:05.498496] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:21.000 [2024-07-15 07:52:05.498518] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:21.000 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:21.000 07:52:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:21.000 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:21.000 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:21.000 BaseBdev1_malloc 00:16:21.000 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:21.259 true 00:16:21.259 07:52:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:21.517 [2024-07-15 07:52:06.111651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:21.517 [2024-07-15 07:52:06.111686] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.518 [2024-07-15 07:52:06.111697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x160fb50 00:16:21.518 [2024-07-15 07:52:06.111704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.518 [2024-07-15 07:52:06.112994] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.518 [2024-07-15 07:52:06.113014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:21.518 BaseBdev1 00:16:21.518 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:21.518 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:21.776 BaseBdev2_malloc 00:16:21.776 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:21.776 true 00:16:21.776 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:22.036 [2024-07-15 07:52:06.678694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:22.036 [2024-07-15 07:52:06.678723] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.036 [2024-07-15 07:52:06.678733] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f3ea0 00:16:22.036 [2024-07-15 07:52:06.678739] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.036 [2024-07-15 07:52:06.679877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.036 [2024-07-15 07:52:06.679895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:22.036 BaseBdev2 00:16:22.036 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:22.036 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:22.295 BaseBdev3_malloc 00:16:22.295 07:52:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:22.295 true 00:16:22.554 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:22.554 [2024-07-15 07:52:07.225641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:22.554 [2024-07-15 07:52:07.225666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.555 [2024-07-15 07:52:07.225677] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f7fb0 00:16:22.555 [2024-07-15 07:52:07.225683] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.555 [2024-07-15 07:52:07.226831] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.555 [2024-07-15 07:52:07.226848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:22.555 BaseBdev3 00:16:22.555 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:22.814 [2024-07-15 07:52:07.406207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:22.814 [2024-07-15 07:52:07.407190] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:22.814 [2024-07-15 07:52:07.407242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:22.814 [2024-07-15 07:52:07.407393] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15f90e0 00:16:22.814 [2024-07-15 07:52:07.407400] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:22.814 [2024-07-15 07:52:07.407537] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15f69b0 00:16:22.814 [2024-07-15 07:52:07.407656] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15f90e0 00:16:22.814 [2024-07-15 07:52:07.407662] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15f90e0 00:16:22.814 [2024-07-15 07:52:07.407744] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:22.814 
07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.814 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:23.074 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.074 "name": "raid_bdev1", 00:16:23.074 "uuid": "791b9513-9a63-4e02-911a-b605794a5a78", 00:16:23.074 "strip_size_kb": 0, 00:16:23.074 "state": "online", 00:16:23.074 "raid_level": "raid1", 00:16:23.074 "superblock": true, 00:16:23.074 "num_base_bdevs": 3, 00:16:23.074 "num_base_bdevs_discovered": 3, 00:16:23.074 "num_base_bdevs_operational": 3, 00:16:23.074 "base_bdevs_list": [ 00:16:23.074 { 00:16:23.074 "name": "BaseBdev1", 00:16:23.074 "uuid": "3ec97b2e-dd40-5fc4-b634-55cf9591e952", 00:16:23.074 "is_configured": true, 00:16:23.074 "data_offset": 2048, 00:16:23.074 "data_size": 63488 00:16:23.074 }, 00:16:23.074 { 00:16:23.074 "name": "BaseBdev2", 00:16:23.074 "uuid": "8a8f2674-4611-54fe-b45c-52306d62e7b1", 00:16:23.074 "is_configured": true, 00:16:23.074 "data_offset": 2048, 00:16:23.074 "data_size": 63488 00:16:23.074 }, 00:16:23.074 { 00:16:23.074 "name": "BaseBdev3", 00:16:23.074 "uuid": "9db40025-a725-539f-b406-fb6111594cd1", 00:16:23.074 "is_configured": true, 00:16:23.074 "data_offset": 2048, 00:16:23.074 "data_size": 63488 00:16:23.074 } 00:16:23.074 ] 00:16:23.074 }' 00:16:23.074 07:52:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.074 07:52:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.641 07:52:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:23.641 07:52:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:23.641 [2024-07-15 07:52:08.232514] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15fd2e0 00:16:24.581 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:24.582 [2024-07-15 07:52:09.323968] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:16:24.582 [2024-07-15 07:52:09.324015] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:24.582 [2024-07-15 07:52:09.324191] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15fd2e0 00:16:24.841 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:24.841 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:16:24.841 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:16:24.841 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:16:24.841 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:24.841 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.842 "name": "raid_bdev1", 00:16:24.842 "uuid": "791b9513-9a63-4e02-911a-b605794a5a78", 00:16:24.842 "strip_size_kb": 0, 00:16:24.842 "state": "online", 00:16:24.842 "raid_level": "raid1", 00:16:24.842 "superblock": true, 00:16:24.842 "num_base_bdevs": 3, 00:16:24.842 "num_base_bdevs_discovered": 2, 00:16:24.842 "num_base_bdevs_operational": 2, 00:16:24.842 "base_bdevs_list": [ 00:16:24.842 { 00:16:24.842 "name": null, 00:16:24.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.842 "is_configured": false, 00:16:24.842 "data_offset": 2048, 00:16:24.842 "data_size": 63488 00:16:24.842 }, 00:16:24.842 { 00:16:24.842 "name": "BaseBdev2", 00:16:24.842 "uuid": "8a8f2674-4611-54fe-b45c-52306d62e7b1", 00:16:24.842 "is_configured": true, 00:16:24.842 "data_offset": 2048, 00:16:24.842 "data_size": 63488 00:16:24.842 }, 00:16:24.842 { 00:16:24.842 "name": "BaseBdev3", 00:16:24.842 "uuid": "9db40025-a725-539f-b406-fb6111594cd1", 00:16:24.842 "is_configured": true, 00:16:24.842 "data_offset": 2048, 00:16:24.842 "data_size": 63488 00:16:24.842 } 00:16:24.842 ] 00:16:24.842 }' 00:16:24.842 07:52:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.842 
07:52:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.411 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:25.672 [2024-07-15 07:52:10.186744] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:25.672 [2024-07-15 07:52:10.186777] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:25.672 [2024-07-15 07:52:10.189350] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:25.672 [2024-07-15 07:52:10.189373] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:25.672 [2024-07-15 07:52:10.189429] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:25.672 [2024-07-15 07:52:10.189435] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15f90e0 name raid_bdev1, state offline 00:16:25.672 0 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1653549 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1653549 ']' 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1653549 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1653549 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1653549' 00:16:25.672 killing process with pid 1653549 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1653549 00:16:25.672 [2024-07-15 07:52:10.275099] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1653549 00:16:25.672 [2024-07-15 07:52:10.286015] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.gPuMb9dAU1 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:16:25.672 00:16:25.672 real 0m5.168s 00:16:25.672 user 0m8.451s 00:16:25.672 sys 0m0.766s 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:16:25.672 07:52:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.672 ************************************ 00:16:25.672 END TEST raid_write_error_test 00:16:25.672 ************************************ 00:16:25.934 07:52:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:25.934 07:52:10 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:16:25.934 07:52:10 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:25.934 07:52:10 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:16:25.934 07:52:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:25.934 07:52:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:25.934 07:52:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:25.934 ************************************ 00:16:25.934 START TEST raid_state_function_test 00:16:25.934 ************************************ 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1654553 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1654553' 00:16:25.934 Process raid pid: 1654553 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1654553 /var/tmp/spdk-raid.sock 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1654553 ']' 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:25.934 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:25.934 07:52:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.934 [2024-07-15 07:52:10.560944] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:16:25.934 [2024-07-15 07:52:10.561001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:25.934 [2024-07-15 07:52:10.654588] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:26.195 [2024-07-15 07:52:10.729692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:26.195 [2024-07-15 07:52:10.779245] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.195 [2024-07-15 07:52:10.779270] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.764 07:52:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:26.764 07:52:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:26.764 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:27.024 [2024-07-15 07:52:11.551602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:27.024 [2024-07-15 07:52:11.551631] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:27.024 [2024-07-15 07:52:11.551637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:27.024 [2024-07-15 07:52:11.551644] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:27.024 [2024-07-15 07:52:11.551648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:27.024 [2024-07-15 07:52:11.551654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:27.024 [2024-07-15 07:52:11.551658] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:27.024 [2024-07-15 07:52:11.551664] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.024 "name": "Existed_Raid", 00:16:27.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.024 "strip_size_kb": 64, 00:16:27.024 "state": "configuring", 00:16:27.024 "raid_level": "raid0", 00:16:27.024 "superblock": false, 00:16:27.024 "num_base_bdevs": 4, 00:16:27.024 "num_base_bdevs_discovered": 0, 00:16:27.024 "num_base_bdevs_operational": 4, 00:16:27.024 "base_bdevs_list": [ 00:16:27.024 { 00:16:27.024 "name": "BaseBdev1", 00:16:27.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.024 "is_configured": false, 00:16:27.024 "data_offset": 0, 00:16:27.024 "data_size": 0 00:16:27.024 }, 00:16:27.024 { 00:16:27.024 "name": "BaseBdev2", 00:16:27.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.024 "is_configured": false, 00:16:27.024 "data_offset": 0, 00:16:27.024 "data_size": 0 00:16:27.024 }, 00:16:27.024 { 00:16:27.024 "name": "BaseBdev3", 00:16:27.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.024 "is_configured": false, 00:16:27.024 "data_offset": 0, 00:16:27.024 "data_size": 0 00:16:27.024 }, 00:16:27.024 { 00:16:27.024 "name": "BaseBdev4", 00:16:27.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.024 "is_configured": false, 00:16:27.024 "data_offset": 0, 00:16:27.024 "data_size": 0 00:16:27.024 } 00:16:27.024 ] 00:16:27.024 }' 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.024 07:52:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.595 07:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:27.898 [2024-07-15 07:52:12.453778] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:27.898 [2024-07-15 07:52:12.453796] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x220a6f0 name Existed_Raid, state configuring 00:16:27.898 07:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:27.898 [2024-07-15 07:52:12.646281] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:27.898 [2024-07-15 07:52:12.646298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:27.898 [2024-07-15 07:52:12.646303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:27.898 [2024-07-15 07:52:12.646309] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:27.898 [2024-07-15 07:52:12.646314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:27.898 [2024-07-15 07:52:12.646319] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:27.898 [2024-07-15 07:52:12.646324] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:27.898 
[2024-07-15 07:52:12.646329] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:28.157 [2024-07-15 07:52:12.845125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:28.157 BaseBdev1 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:28.157 07:52:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:28.158 07:52:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:28.418 07:52:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:28.677 [ 00:16:28.677 { 00:16:28.677 "name": "BaseBdev1", 00:16:28.677 "aliases": [ 00:16:28.677 "41a295fc-291e-4b33-af7a-870322001483" 00:16:28.677 ], 00:16:28.677 "product_name": "Malloc disk", 00:16:28.677 "block_size": 512, 00:16:28.677 "num_blocks": 65536, 00:16:28.677 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:28.677 "assigned_rate_limits": { 00:16:28.677 "rw_ios_per_sec": 0, 00:16:28.677 "rw_mbytes_per_sec": 0, 00:16:28.677 "r_mbytes_per_sec": 0, 00:16:28.677 "w_mbytes_per_sec": 0 00:16:28.677 }, 00:16:28.677 "claimed": true, 00:16:28.677 "claim_type": "exclusive_write", 00:16:28.677 "zoned": false, 00:16:28.677 "supported_io_types": { 00:16:28.677 "read": true, 00:16:28.677 "write": true, 00:16:28.677 "unmap": true, 00:16:28.677 "flush": true, 00:16:28.677 "reset": true, 00:16:28.677 "nvme_admin": false, 00:16:28.677 "nvme_io": false, 00:16:28.677 "nvme_io_md": false, 00:16:28.677 "write_zeroes": true, 00:16:28.677 "zcopy": true, 00:16:28.677 "get_zone_info": false, 00:16:28.677 "zone_management": false, 00:16:28.677 "zone_append": false, 00:16:28.677 "compare": false, 00:16:28.677 "compare_and_write": false, 00:16:28.677 "abort": true, 00:16:28.677 "seek_hole": false, 00:16:28.677 "seek_data": false, 00:16:28.677 "copy": true, 00:16:28.677 "nvme_iov_md": false 00:16:28.677 }, 00:16:28.677 "memory_domains": [ 00:16:28.677 { 00:16:28.677 "dma_device_id": "system", 00:16:28.677 "dma_device_type": 1 00:16:28.677 }, 00:16:28.677 { 00:16:28.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.677 "dma_device_type": 2 00:16:28.677 } 00:16:28.677 ], 00:16:28.677 "driver_specific": {} 00:16:28.677 } 00:16:28.677 ] 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.677 "name": "Existed_Raid", 00:16:28.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.677 "strip_size_kb": 64, 00:16:28.677 "state": "configuring", 00:16:28.677 "raid_level": "raid0", 00:16:28.677 "superblock": false, 00:16:28.677 "num_base_bdevs": 4, 00:16:28.677 "num_base_bdevs_discovered": 1, 00:16:28.677 "num_base_bdevs_operational": 4, 00:16:28.677 "base_bdevs_list": [ 00:16:28.677 { 00:16:28.677 "name": "BaseBdev1", 00:16:28.677 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:28.677 "is_configured": true, 00:16:28.677 "data_offset": 0, 00:16:28.677 "data_size": 65536 00:16:28.677 }, 00:16:28.677 { 00:16:28.677 "name": "BaseBdev2", 00:16:28.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.677 "is_configured": false, 00:16:28.677 "data_offset": 0, 00:16:28.677 "data_size": 0 00:16:28.677 }, 00:16:28.677 { 00:16:28.677 "name": "BaseBdev3", 00:16:28.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.677 "is_configured": false, 00:16:28.677 "data_offset": 0, 00:16:28.677 "data_size": 0 00:16:28.677 }, 00:16:28.677 { 00:16:28.677 "name": "BaseBdev4", 00:16:28.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.677 "is_configured": false, 00:16:28.677 "data_offset": 0, 00:16:28.677 "data_size": 0 00:16:28.677 } 00:16:28.677 ] 00:16:28.677 }' 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.677 07:52:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.245 07:52:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:29.505 [2024-07-15 07:52:14.112319] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:29.505 [2024-07-15 07:52:14.112352] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2209f60 name Existed_Raid, state configuring 00:16:29.505 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:29.765 [2024-07-15 07:52:14.304858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:29.765 [2024-07-15 07:52:14.305964] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:29.765 [2024-07-15 07:52:14.305990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:29.765 [2024-07-15 07:52:14.305996] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:29.765 [2024-07-15 07:52:14.306002] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.765 [2024-07-15 07:52:14.306006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:29.765 [2024-07-15 07:52:14.306012] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.765 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:30.025 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.025 "name": "Existed_Raid", 00:16:30.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.025 "strip_size_kb": 64, 00:16:30.025 "state": "configuring", 00:16:30.025 "raid_level": "raid0", 00:16:30.025 "superblock": false, 00:16:30.025 "num_base_bdevs": 4, 00:16:30.025 "num_base_bdevs_discovered": 1, 00:16:30.025 "num_base_bdevs_operational": 4, 00:16:30.025 "base_bdevs_list": [ 00:16:30.025 { 00:16:30.025 "name": "BaseBdev1", 00:16:30.025 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:30.025 "is_configured": true, 00:16:30.025 "data_offset": 0, 00:16:30.025 "data_size": 65536 00:16:30.025 }, 00:16:30.025 { 00:16:30.025 "name": "BaseBdev2", 00:16:30.025 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:30.025 "is_configured": false, 00:16:30.025 "data_offset": 0, 00:16:30.025 "data_size": 0 00:16:30.025 }, 00:16:30.025 { 00:16:30.025 "name": "BaseBdev3", 00:16:30.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.025 "is_configured": false, 00:16:30.025 "data_offset": 0, 00:16:30.025 "data_size": 0 00:16:30.025 }, 00:16:30.025 { 00:16:30.025 "name": "BaseBdev4", 00:16:30.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.025 "is_configured": false, 00:16:30.025 "data_offset": 0, 00:16:30.025 "data_size": 0 00:16:30.025 } 00:16:30.025 ] 00:16:30.025 }' 00:16:30.025 07:52:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.025 07:52:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:30.595 [2024-07-15 07:52:15.255957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:30.595 BaseBdev2 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:30.595 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.854 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:31.113 [ 00:16:31.113 { 00:16:31.113 "name": "BaseBdev2", 00:16:31.113 "aliases": [ 00:16:31.113 "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8" 00:16:31.113 ], 00:16:31.113 "product_name": "Malloc disk", 00:16:31.113 "block_size": 512, 00:16:31.113 "num_blocks": 65536, 00:16:31.113 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:31.113 "assigned_rate_limits": { 00:16:31.113 "rw_ios_per_sec": 0, 00:16:31.113 "rw_mbytes_per_sec": 0, 00:16:31.113 "r_mbytes_per_sec": 0, 00:16:31.113 "w_mbytes_per_sec": 0 00:16:31.113 }, 00:16:31.113 "claimed": true, 00:16:31.113 "claim_type": "exclusive_write", 00:16:31.113 "zoned": false, 00:16:31.113 "supported_io_types": { 00:16:31.113 "read": true, 00:16:31.113 "write": true, 00:16:31.113 "unmap": true, 00:16:31.113 "flush": true, 00:16:31.113 "reset": true, 00:16:31.113 "nvme_admin": false, 00:16:31.113 "nvme_io": false, 00:16:31.113 "nvme_io_md": false, 00:16:31.113 "write_zeroes": true, 00:16:31.113 "zcopy": true, 00:16:31.113 "get_zone_info": false, 00:16:31.113 "zone_management": false, 00:16:31.113 "zone_append": false, 00:16:31.113 "compare": false, 00:16:31.113 "compare_and_write": false, 00:16:31.113 "abort": true, 00:16:31.113 "seek_hole": false, 00:16:31.113 "seek_data": false, 00:16:31.113 
"copy": true, 00:16:31.113 "nvme_iov_md": false 00:16:31.113 }, 00:16:31.113 "memory_domains": [ 00:16:31.113 { 00:16:31.113 "dma_device_id": "system", 00:16:31.113 "dma_device_type": 1 00:16:31.113 }, 00:16:31.113 { 00:16:31.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.113 "dma_device_type": 2 00:16:31.113 } 00:16:31.113 ], 00:16:31.113 "driver_specific": {} 00:16:31.113 } 00:16:31.113 ] 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.113 "name": "Existed_Raid", 00:16:31.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.113 "strip_size_kb": 64, 00:16:31.113 "state": "configuring", 00:16:31.113 "raid_level": "raid0", 00:16:31.113 "superblock": false, 00:16:31.113 "num_base_bdevs": 4, 00:16:31.113 "num_base_bdevs_discovered": 2, 00:16:31.113 "num_base_bdevs_operational": 4, 00:16:31.113 "base_bdevs_list": [ 00:16:31.113 { 00:16:31.113 "name": "BaseBdev1", 00:16:31.113 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:31.113 "is_configured": true, 00:16:31.113 "data_offset": 0, 00:16:31.113 "data_size": 65536 00:16:31.113 }, 00:16:31.113 { 00:16:31.113 "name": "BaseBdev2", 00:16:31.113 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:31.113 "is_configured": true, 00:16:31.113 "data_offset": 0, 00:16:31.113 "data_size": 65536 00:16:31.113 }, 00:16:31.113 { 00:16:31.113 "name": "BaseBdev3", 00:16:31.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.113 "is_configured": false, 00:16:31.113 "data_offset": 0, 00:16:31.113 "data_size": 0 00:16:31.113 }, 00:16:31.113 { 00:16:31.113 "name": "BaseBdev4", 00:16:31.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.113 "is_configured": false, 00:16:31.113 
"data_offset": 0, 00:16:31.113 "data_size": 0 00:16:31.113 } 00:16:31.113 ] 00:16:31.113 }' 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.113 07:52:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.682 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:31.942 [2024-07-15 07:52:16.531830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:31.942 BaseBdev3 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:31.942 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:32.203 [ 00:16:32.203 { 00:16:32.203 "name": "BaseBdev3", 00:16:32.203 "aliases": [ 00:16:32.203 "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23" 00:16:32.203 ], 00:16:32.203 "product_name": "Malloc disk", 00:16:32.203 "block_size": 512, 00:16:32.203 "num_blocks": 65536, 00:16:32.203 "uuid": "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23", 00:16:32.203 "assigned_rate_limits": { 00:16:32.203 "rw_ios_per_sec": 0, 00:16:32.203 "rw_mbytes_per_sec": 0, 00:16:32.203 "r_mbytes_per_sec": 0, 00:16:32.203 "w_mbytes_per_sec": 0 00:16:32.203 }, 00:16:32.203 "claimed": true, 00:16:32.203 "claim_type": "exclusive_write", 00:16:32.203 "zoned": false, 00:16:32.203 "supported_io_types": { 00:16:32.203 "read": true, 00:16:32.203 "write": true, 00:16:32.203 "unmap": true, 00:16:32.203 "flush": true, 00:16:32.203 "reset": true, 00:16:32.203 "nvme_admin": false, 00:16:32.203 "nvme_io": false, 00:16:32.203 "nvme_io_md": false, 00:16:32.203 "write_zeroes": true, 00:16:32.203 "zcopy": true, 00:16:32.203 "get_zone_info": false, 00:16:32.203 "zone_management": false, 00:16:32.203 "zone_append": false, 00:16:32.203 "compare": false, 00:16:32.203 "compare_and_write": false, 00:16:32.203 "abort": true, 00:16:32.203 "seek_hole": false, 00:16:32.203 "seek_data": false, 00:16:32.203 "copy": true, 00:16:32.203 "nvme_iov_md": false 00:16:32.203 }, 00:16:32.203 "memory_domains": [ 00:16:32.203 { 00:16:32.203 "dma_device_id": "system", 00:16:32.203 "dma_device_type": 1 00:16:32.203 }, 00:16:32.203 { 00:16:32.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.203 "dma_device_type": 2 00:16:32.203 } 00:16:32.203 ], 00:16:32.203 "driver_specific": {} 00:16:32.203 } 00:16:32.203 ] 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:32.203 07:52:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.203 07:52:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.464 07:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.464 "name": "Existed_Raid", 00:16:32.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.464 "strip_size_kb": 64, 00:16:32.464 "state": "configuring", 00:16:32.464 "raid_level": "raid0", 00:16:32.464 "superblock": false, 00:16:32.464 "num_base_bdevs": 4, 00:16:32.464 "num_base_bdevs_discovered": 3, 00:16:32.464 "num_base_bdevs_operational": 4, 00:16:32.464 "base_bdevs_list": [ 00:16:32.464 { 00:16:32.464 "name": "BaseBdev1", 00:16:32.464 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:32.464 "is_configured": true, 00:16:32.464 "data_offset": 0, 00:16:32.464 "data_size": 65536 00:16:32.464 }, 00:16:32.464 { 00:16:32.464 "name": "BaseBdev2", 00:16:32.464 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:32.464 "is_configured": true, 00:16:32.464 "data_offset": 0, 00:16:32.464 "data_size": 65536 00:16:32.464 }, 00:16:32.464 { 00:16:32.464 "name": "BaseBdev3", 00:16:32.464 "uuid": "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23", 00:16:32.464 "is_configured": true, 00:16:32.464 "data_offset": 0, 00:16:32.464 "data_size": 65536 00:16:32.464 }, 00:16:32.464 { 00:16:32.464 "name": "BaseBdev4", 00:16:32.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.464 "is_configured": false, 00:16:32.464 "data_offset": 0, 00:16:32.464 "data_size": 0 00:16:32.464 } 00:16:32.464 ] 00:16:32.464 }' 00:16:32.464 07:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.464 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.042 07:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:33.310 [2024-07-15 
07:52:17.876268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:33.310 [2024-07-15 07:52:17.876294] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x220afc0 00:16:33.310 [2024-07-15 07:52:17.876299] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:33.310 [2024-07-15 07:52:17.876458] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x220ac00 00:16:33.310 [2024-07-15 07:52:17.876550] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x220afc0 00:16:33.310 [2024-07-15 07:52:17.876556] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x220afc0 00:16:33.310 [2024-07-15 07:52:17.876672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:33.310 BaseBdev4 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:33.310 07:52:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:33.569 [ 00:16:33.569 { 00:16:33.569 "name": "BaseBdev4", 00:16:33.569 "aliases": [ 00:16:33.569 "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2" 00:16:33.569 ], 00:16:33.569 "product_name": "Malloc disk", 00:16:33.569 "block_size": 512, 00:16:33.569 "num_blocks": 65536, 00:16:33.569 "uuid": "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2", 00:16:33.569 "assigned_rate_limits": { 00:16:33.569 "rw_ios_per_sec": 0, 00:16:33.569 "rw_mbytes_per_sec": 0, 00:16:33.569 "r_mbytes_per_sec": 0, 00:16:33.569 "w_mbytes_per_sec": 0 00:16:33.569 }, 00:16:33.569 "claimed": true, 00:16:33.569 "claim_type": "exclusive_write", 00:16:33.569 "zoned": false, 00:16:33.569 "supported_io_types": { 00:16:33.569 "read": true, 00:16:33.569 "write": true, 00:16:33.569 "unmap": true, 00:16:33.569 "flush": true, 00:16:33.569 "reset": true, 00:16:33.569 "nvme_admin": false, 00:16:33.569 "nvme_io": false, 00:16:33.569 "nvme_io_md": false, 00:16:33.569 "write_zeroes": true, 00:16:33.569 "zcopy": true, 00:16:33.569 "get_zone_info": false, 00:16:33.569 "zone_management": false, 00:16:33.569 "zone_append": false, 00:16:33.569 "compare": false, 00:16:33.569 "compare_and_write": false, 00:16:33.569 "abort": true, 00:16:33.569 "seek_hole": false, 00:16:33.569 "seek_data": false, 00:16:33.569 "copy": true, 00:16:33.569 "nvme_iov_md": false 00:16:33.569 }, 00:16:33.569 "memory_domains": [ 00:16:33.569 { 00:16:33.569 "dma_device_id": "system", 00:16:33.569 "dma_device_type": 1 00:16:33.569 }, 00:16:33.569 { 00:16:33.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.569 "dma_device_type": 2 
00:16:33.569 } 00:16:33.569 ], 00:16:33.569 "driver_specific": {} 00:16:33.569 } 00:16:33.569 ] 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.569 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.829 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.829 "name": "Existed_Raid", 00:16:33.829 "uuid": "910a5553-a065-4d75-bfe4-afa8e79fbf34", 00:16:33.829 "strip_size_kb": 64, 00:16:33.829 "state": "online", 00:16:33.829 "raid_level": "raid0", 00:16:33.829 "superblock": false, 00:16:33.829 "num_base_bdevs": 4, 00:16:33.829 "num_base_bdevs_discovered": 4, 00:16:33.829 "num_base_bdevs_operational": 4, 00:16:33.829 "base_bdevs_list": [ 00:16:33.829 { 00:16:33.829 "name": "BaseBdev1", 00:16:33.829 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:33.829 "is_configured": true, 00:16:33.829 "data_offset": 0, 00:16:33.829 "data_size": 65536 00:16:33.829 }, 00:16:33.829 { 00:16:33.829 "name": "BaseBdev2", 00:16:33.829 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:33.829 "is_configured": true, 00:16:33.829 "data_offset": 0, 00:16:33.829 "data_size": 65536 00:16:33.829 }, 00:16:33.829 { 00:16:33.829 "name": "BaseBdev3", 00:16:33.829 "uuid": "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23", 00:16:33.829 "is_configured": true, 00:16:33.829 "data_offset": 0, 00:16:33.829 "data_size": 65536 00:16:33.829 }, 00:16:33.829 { 00:16:33.829 "name": "BaseBdev4", 00:16:33.829 "uuid": "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2", 00:16:33.829 "is_configured": true, 00:16:33.829 "data_offset": 0, 00:16:33.829 "data_size": 65536 00:16:33.829 } 00:16:33.829 ] 00:16:33.830 }' 00:16:33.830 07:52:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.830 07:52:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.398 07:52:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:34.398 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:34.658 [2024-07-15 07:52:19.175834] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:34.658 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:34.658 "name": "Existed_Raid", 00:16:34.658 "aliases": [ 00:16:34.658 "910a5553-a065-4d75-bfe4-afa8e79fbf34" 00:16:34.658 ], 00:16:34.658 "product_name": "Raid Volume", 00:16:34.658 "block_size": 512, 00:16:34.658 "num_blocks": 262144, 00:16:34.658 "uuid": "910a5553-a065-4d75-bfe4-afa8e79fbf34", 00:16:34.658 "assigned_rate_limits": { 00:16:34.658 "rw_ios_per_sec": 0, 00:16:34.658 "rw_mbytes_per_sec": 0, 00:16:34.658 "r_mbytes_per_sec": 0, 00:16:34.658 "w_mbytes_per_sec": 0 00:16:34.658 }, 00:16:34.658 "claimed": false, 00:16:34.658 "zoned": false, 00:16:34.658 "supported_io_types": { 00:16:34.658 "read": true, 00:16:34.658 "write": true, 00:16:34.658 "unmap": true, 00:16:34.658 "flush": true, 00:16:34.658 "reset": true, 00:16:34.658 "nvme_admin": false, 00:16:34.658 "nvme_io": false, 00:16:34.658 "nvme_io_md": false, 00:16:34.658 "write_zeroes": true, 00:16:34.658 "zcopy": false, 00:16:34.658 "get_zone_info": false, 00:16:34.658 "zone_management": false, 00:16:34.658 "zone_append": false, 00:16:34.658 "compare": false, 00:16:34.658 "compare_and_write": false, 00:16:34.658 "abort": false, 00:16:34.658 "seek_hole": false, 00:16:34.658 "seek_data": false, 00:16:34.658 "copy": false, 00:16:34.658 "nvme_iov_md": false 00:16:34.658 }, 00:16:34.658 "memory_domains": [ 00:16:34.658 { 00:16:34.658 "dma_device_id": "system", 00:16:34.658 "dma_device_type": 1 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.658 "dma_device_type": 2 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "system", 00:16:34.658 "dma_device_type": 1 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.658 "dma_device_type": 2 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "system", 00:16:34.658 "dma_device_type": 1 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.658 "dma_device_type": 2 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "system", 00:16:34.658 "dma_device_type": 1 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.658 "dma_device_type": 2 00:16:34.658 } 00:16:34.658 ], 00:16:34.658 "driver_specific": { 00:16:34.658 "raid": { 00:16:34.658 "uuid": "910a5553-a065-4d75-bfe4-afa8e79fbf34", 00:16:34.658 "strip_size_kb": 64, 00:16:34.658 
"state": "online", 00:16:34.658 "raid_level": "raid0", 00:16:34.658 "superblock": false, 00:16:34.658 "num_base_bdevs": 4, 00:16:34.658 "num_base_bdevs_discovered": 4, 00:16:34.658 "num_base_bdevs_operational": 4, 00:16:34.658 "base_bdevs_list": [ 00:16:34.658 { 00:16:34.658 "name": "BaseBdev1", 00:16:34.658 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:34.658 "is_configured": true, 00:16:34.658 "data_offset": 0, 00:16:34.658 "data_size": 65536 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "name": "BaseBdev2", 00:16:34.658 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:34.658 "is_configured": true, 00:16:34.658 "data_offset": 0, 00:16:34.658 "data_size": 65536 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "name": "BaseBdev3", 00:16:34.658 "uuid": "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23", 00:16:34.658 "is_configured": true, 00:16:34.658 "data_offset": 0, 00:16:34.658 "data_size": 65536 00:16:34.658 }, 00:16:34.658 { 00:16:34.658 "name": "BaseBdev4", 00:16:34.658 "uuid": "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2", 00:16:34.658 "is_configured": true, 00:16:34.658 "data_offset": 0, 00:16:34.658 "data_size": 65536 00:16:34.658 } 00:16:34.658 ] 00:16:34.658 } 00:16:34.658 } 00:16:34.658 }' 00:16:34.658 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:34.658 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:34.658 BaseBdev2 00:16:34.658 BaseBdev3 00:16:34.658 BaseBdev4' 00:16:34.658 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:34.658 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:34.658 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:34.918 "name": "BaseBdev1", 00:16:34.918 "aliases": [ 00:16:34.918 "41a295fc-291e-4b33-af7a-870322001483" 00:16:34.918 ], 00:16:34.918 "product_name": "Malloc disk", 00:16:34.918 "block_size": 512, 00:16:34.918 "num_blocks": 65536, 00:16:34.918 "uuid": "41a295fc-291e-4b33-af7a-870322001483", 00:16:34.918 "assigned_rate_limits": { 00:16:34.918 "rw_ios_per_sec": 0, 00:16:34.918 "rw_mbytes_per_sec": 0, 00:16:34.918 "r_mbytes_per_sec": 0, 00:16:34.918 "w_mbytes_per_sec": 0 00:16:34.918 }, 00:16:34.918 "claimed": true, 00:16:34.918 "claim_type": "exclusive_write", 00:16:34.918 "zoned": false, 00:16:34.918 "supported_io_types": { 00:16:34.918 "read": true, 00:16:34.918 "write": true, 00:16:34.918 "unmap": true, 00:16:34.918 "flush": true, 00:16:34.918 "reset": true, 00:16:34.918 "nvme_admin": false, 00:16:34.918 "nvme_io": false, 00:16:34.918 "nvme_io_md": false, 00:16:34.918 "write_zeroes": true, 00:16:34.918 "zcopy": true, 00:16:34.918 "get_zone_info": false, 00:16:34.918 "zone_management": false, 00:16:34.918 "zone_append": false, 00:16:34.918 "compare": false, 00:16:34.918 "compare_and_write": false, 00:16:34.918 "abort": true, 00:16:34.918 "seek_hole": false, 00:16:34.918 "seek_data": false, 00:16:34.918 "copy": true, 00:16:34.918 "nvme_iov_md": false 00:16:34.918 }, 00:16:34.918 "memory_domains": [ 00:16:34.918 { 00:16:34.918 "dma_device_id": "system", 00:16:34.918 "dma_device_type": 1 00:16:34.918 }, 00:16:34.918 { 00:16:34.918 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.918 "dma_device_type": 2 00:16:34.918 } 00:16:34.918 ], 00:16:34.918 "driver_specific": {} 00:16:34.918 }' 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.918 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:35.178 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:35.439 07:52:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:35.439 "name": "BaseBdev2", 00:16:35.439 "aliases": [ 00:16:35.439 "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8" 00:16:35.439 ], 00:16:35.439 "product_name": "Malloc disk", 00:16:35.439 "block_size": 512, 00:16:35.439 "num_blocks": 65536, 00:16:35.439 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:35.439 "assigned_rate_limits": { 00:16:35.439 "rw_ios_per_sec": 0, 00:16:35.439 "rw_mbytes_per_sec": 0, 00:16:35.439 "r_mbytes_per_sec": 0, 00:16:35.439 "w_mbytes_per_sec": 0 00:16:35.439 }, 00:16:35.439 "claimed": true, 00:16:35.439 "claim_type": "exclusive_write", 00:16:35.439 "zoned": false, 00:16:35.439 "supported_io_types": { 00:16:35.439 "read": true, 00:16:35.439 "write": true, 00:16:35.439 "unmap": true, 00:16:35.439 "flush": true, 00:16:35.439 "reset": true, 00:16:35.439 "nvme_admin": false, 00:16:35.439 "nvme_io": false, 00:16:35.439 "nvme_io_md": false, 00:16:35.439 "write_zeroes": true, 00:16:35.439 "zcopy": true, 00:16:35.439 "get_zone_info": false, 00:16:35.439 "zone_management": false, 00:16:35.439 "zone_append": false, 00:16:35.439 "compare": false, 00:16:35.439 "compare_and_write": false, 00:16:35.439 "abort": true, 00:16:35.439 "seek_hole": false, 00:16:35.439 "seek_data": false, 00:16:35.439 "copy": true, 00:16:35.439 "nvme_iov_md": false 00:16:35.439 }, 00:16:35.439 "memory_domains": [ 00:16:35.439 { 00:16:35.439 "dma_device_id": "system", 00:16:35.439 "dma_device_type": 1 00:16:35.439 }, 00:16:35.439 { 00:16:35.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.439 "dma_device_type": 2 00:16:35.439 } 00:16:35.439 ], 00:16:35.439 "driver_specific": {} 00:16:35.439 }' 00:16:35.439 07:52:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.439 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.439 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:35.439 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.439 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.439 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:35.439 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:35.699 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:35.959 "name": "BaseBdev3", 00:16:35.959 "aliases": [ 00:16:35.959 "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23" 00:16:35.959 ], 00:16:35.959 "product_name": "Malloc disk", 00:16:35.959 "block_size": 512, 00:16:35.959 "num_blocks": 65536, 00:16:35.959 "uuid": "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23", 00:16:35.959 "assigned_rate_limits": { 00:16:35.959 "rw_ios_per_sec": 0, 00:16:35.959 "rw_mbytes_per_sec": 0, 00:16:35.959 "r_mbytes_per_sec": 0, 00:16:35.959 "w_mbytes_per_sec": 0 00:16:35.959 }, 00:16:35.959 "claimed": true, 00:16:35.959 "claim_type": "exclusive_write", 00:16:35.959 "zoned": false, 00:16:35.959 "supported_io_types": { 00:16:35.959 "read": true, 00:16:35.959 "write": true, 00:16:35.959 "unmap": true, 00:16:35.959 "flush": true, 00:16:35.959 "reset": true, 00:16:35.959 "nvme_admin": false, 00:16:35.959 "nvme_io": false, 00:16:35.959 "nvme_io_md": false, 00:16:35.959 "write_zeroes": true, 00:16:35.959 "zcopy": true, 00:16:35.959 "get_zone_info": false, 00:16:35.959 "zone_management": false, 00:16:35.959 "zone_append": false, 00:16:35.959 "compare": false, 00:16:35.959 "compare_and_write": false, 00:16:35.959 "abort": true, 00:16:35.959 "seek_hole": false, 00:16:35.959 "seek_data": false, 00:16:35.959 "copy": true, 00:16:35.959 "nvme_iov_md": false 00:16:35.959 }, 00:16:35.959 "memory_domains": [ 00:16:35.959 { 00:16:35.959 "dma_device_id": "system", 00:16:35.959 "dma_device_type": 1 00:16:35.959 }, 00:16:35.959 { 00:16:35.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.959 "dma_device_type": 2 00:16:35.959 } 00:16:35.959 ], 00:16:35.959 "driver_specific": {} 00:16:35.959 }' 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
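The repeated jq .block_size / .md_size / .md_interleave / .dif_type extractions followed by [[ x == y ]] tests in this part of the trace come from the property verification step, which (as the paired jq calls suggest) compares each configured base bdev against the raid volume field by field via bdev_get_bdevs. A minimal standalone sketch of that comparison, assuming only jq plus the RPC calls and filters already shown in this trace:

rpc='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
raid_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
# Names of the configured members, taken from the raid volume's own dump.
members=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_info")
for name in $members; do
    base_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
    for field in block_size md_size md_interleave dif_type; do
        # Both sides must agree; fields absent on both sides compare as null == null.
        [[ $(jq ".$field" <<< "$raid_info") == $(jq ".$field" <<< "$base_info") ]] || echo "mismatch: $field on $name"
    done
done

With four 65536-block malloc members of block size 512, the Existed_Raid dump above reports block_size 512 and num_blocks 262144, i.e. the four members striped into a single raid0 volume.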
00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:35.959 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:36.219 07:52:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.480 "name": "BaseBdev4", 00:16:36.480 "aliases": [ 00:16:36.480 "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2" 00:16:36.480 ], 00:16:36.480 "product_name": "Malloc disk", 00:16:36.480 "block_size": 512, 00:16:36.480 "num_blocks": 65536, 00:16:36.480 "uuid": "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2", 00:16:36.480 "assigned_rate_limits": { 00:16:36.480 "rw_ios_per_sec": 0, 00:16:36.480 "rw_mbytes_per_sec": 0, 00:16:36.480 "r_mbytes_per_sec": 0, 00:16:36.480 "w_mbytes_per_sec": 0 00:16:36.480 }, 00:16:36.480 "claimed": true, 00:16:36.480 "claim_type": "exclusive_write", 00:16:36.480 "zoned": false, 00:16:36.480 "supported_io_types": { 00:16:36.480 "read": true, 00:16:36.480 "write": true, 00:16:36.480 "unmap": true, 00:16:36.480 "flush": true, 00:16:36.480 "reset": true, 00:16:36.480 "nvme_admin": false, 00:16:36.480 "nvme_io": false, 00:16:36.480 "nvme_io_md": false, 00:16:36.480 "write_zeroes": true, 00:16:36.480 "zcopy": true, 00:16:36.480 "get_zone_info": false, 00:16:36.480 "zone_management": false, 00:16:36.480 "zone_append": false, 00:16:36.480 "compare": false, 00:16:36.480 "compare_and_write": false, 00:16:36.480 "abort": true, 00:16:36.480 "seek_hole": false, 00:16:36.480 "seek_data": false, 00:16:36.480 "copy": true, 00:16:36.480 "nvme_iov_md": false 00:16:36.480 }, 00:16:36.480 "memory_domains": [ 00:16:36.480 { 00:16:36.480 "dma_device_id": "system", 00:16:36.480 "dma_device_type": 1 00:16:36.480 }, 00:16:36.480 { 00:16:36.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.480 "dma_device_type": 2 00:16:36.480 } 00:16:36.480 ], 00:16:36.480 "driver_specific": {} 00:16:36.480 }' 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.480 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.740 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.740 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.740 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.740 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.740 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.740 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:37.000 [2024-07-15 07:52:21.549628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:37.000 [2024-07-15 07:52:21.549648] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:37.000 [2024-07-15 07:52:21.549684] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.000 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.261 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.261 
"name": "Existed_Raid", 00:16:37.261 "uuid": "910a5553-a065-4d75-bfe4-afa8e79fbf34", 00:16:37.261 "strip_size_kb": 64, 00:16:37.261 "state": "offline", 00:16:37.261 "raid_level": "raid0", 00:16:37.261 "superblock": false, 00:16:37.261 "num_base_bdevs": 4, 00:16:37.261 "num_base_bdevs_discovered": 3, 00:16:37.261 "num_base_bdevs_operational": 3, 00:16:37.261 "base_bdevs_list": [ 00:16:37.261 { 00:16:37.261 "name": null, 00:16:37.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.261 "is_configured": false, 00:16:37.261 "data_offset": 0, 00:16:37.261 "data_size": 65536 00:16:37.261 }, 00:16:37.261 { 00:16:37.261 "name": "BaseBdev2", 00:16:37.261 "uuid": "80246ea7-b9e5-4d6d-930e-f30c08cb9fa8", 00:16:37.261 "is_configured": true, 00:16:37.261 "data_offset": 0, 00:16:37.261 "data_size": 65536 00:16:37.261 }, 00:16:37.261 { 00:16:37.261 "name": "BaseBdev3", 00:16:37.261 "uuid": "f8ebb462-8474-4ff5-aafd-1bef6d7b0b23", 00:16:37.261 "is_configured": true, 00:16:37.261 "data_offset": 0, 00:16:37.261 "data_size": 65536 00:16:37.261 }, 00:16:37.261 { 00:16:37.261 "name": "BaseBdev4", 00:16:37.261 "uuid": "f9225e29-7b8e-4e9d-98f6-a0a18985f9d2", 00:16:37.261 "is_configured": true, 00:16:37.261 "data_offset": 0, 00:16:37.261 "data_size": 65536 00:16:37.261 } 00:16:37.261 ] 00:16:37.261 }' 00:16:37.261 07:52:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.261 07:52:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:37.832 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:38.093 [2024-07-15 07:52:22.660415] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:38.094 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:38.094 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.094 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.094 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:38.354 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:38.354 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:38.354 07:52:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:38.354 
[2024-07-15 07:52:23.043193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:38.354 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:38.354 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.354 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.354 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:38.638 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:38.638 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:38.638 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:38.901 [2024-07-15 07:52:23.409913] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:38.901 [2024-07-15 07:52:23.409946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x220afc0 name Existed_Raid, state offline 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:38.901 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:39.161 BaseBdev2 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:39.161 07:52:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:39.421 07:52:23 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:39.421 [ 00:16:39.421 { 00:16:39.421 "name": "BaseBdev2", 00:16:39.421 "aliases": [ 00:16:39.421 "1472d416-2238-4267-b12a-0424edeccb16" 00:16:39.421 ], 00:16:39.421 "product_name": "Malloc disk", 00:16:39.421 "block_size": 512, 00:16:39.421 "num_blocks": 65536, 00:16:39.421 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:39.421 "assigned_rate_limits": { 00:16:39.421 "rw_ios_per_sec": 0, 00:16:39.421 "rw_mbytes_per_sec": 0, 00:16:39.421 "r_mbytes_per_sec": 0, 00:16:39.421 "w_mbytes_per_sec": 0 00:16:39.421 }, 00:16:39.421 "claimed": false, 00:16:39.421 "zoned": false, 00:16:39.421 "supported_io_types": { 00:16:39.421 "read": true, 00:16:39.421 "write": true, 00:16:39.421 "unmap": true, 00:16:39.421 "flush": true, 00:16:39.421 "reset": true, 00:16:39.421 "nvme_admin": false, 00:16:39.421 "nvme_io": false, 00:16:39.421 "nvme_io_md": false, 00:16:39.421 "write_zeroes": true, 00:16:39.421 "zcopy": true, 00:16:39.421 "get_zone_info": false, 00:16:39.421 "zone_management": false, 00:16:39.421 "zone_append": false, 00:16:39.421 "compare": false, 00:16:39.421 "compare_and_write": false, 00:16:39.421 "abort": true, 00:16:39.421 "seek_hole": false, 00:16:39.421 "seek_data": false, 00:16:39.421 "copy": true, 00:16:39.421 "nvme_iov_md": false 00:16:39.421 }, 00:16:39.421 "memory_domains": [ 00:16:39.421 { 00:16:39.421 "dma_device_id": "system", 00:16:39.421 "dma_device_type": 1 00:16:39.421 }, 00:16:39.421 { 00:16:39.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.421 "dma_device_type": 2 00:16:39.421 } 00:16:39.421 ], 00:16:39.421 "driver_specific": {} 00:16:39.421 } 00:16:39.421 ] 00:16:39.422 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:39.422 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:39.422 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:39.422 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:39.681 BaseBdev3 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:39.681 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:39.941 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:40.201 [ 00:16:40.201 { 00:16:40.201 "name": "BaseBdev3", 00:16:40.201 "aliases": [ 00:16:40.201 
"f2c66c58-53a6-4a84-b393-15265c107e1e" 00:16:40.201 ], 00:16:40.201 "product_name": "Malloc disk", 00:16:40.201 "block_size": 512, 00:16:40.201 "num_blocks": 65536, 00:16:40.201 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:40.201 "assigned_rate_limits": { 00:16:40.201 "rw_ios_per_sec": 0, 00:16:40.201 "rw_mbytes_per_sec": 0, 00:16:40.201 "r_mbytes_per_sec": 0, 00:16:40.201 "w_mbytes_per_sec": 0 00:16:40.201 }, 00:16:40.201 "claimed": false, 00:16:40.201 "zoned": false, 00:16:40.202 "supported_io_types": { 00:16:40.202 "read": true, 00:16:40.202 "write": true, 00:16:40.202 "unmap": true, 00:16:40.202 "flush": true, 00:16:40.202 "reset": true, 00:16:40.202 "nvme_admin": false, 00:16:40.202 "nvme_io": false, 00:16:40.202 "nvme_io_md": false, 00:16:40.202 "write_zeroes": true, 00:16:40.202 "zcopy": true, 00:16:40.202 "get_zone_info": false, 00:16:40.202 "zone_management": false, 00:16:40.202 "zone_append": false, 00:16:40.202 "compare": false, 00:16:40.202 "compare_and_write": false, 00:16:40.202 "abort": true, 00:16:40.202 "seek_hole": false, 00:16:40.202 "seek_data": false, 00:16:40.202 "copy": true, 00:16:40.202 "nvme_iov_md": false 00:16:40.202 }, 00:16:40.202 "memory_domains": [ 00:16:40.202 { 00:16:40.202 "dma_device_id": "system", 00:16:40.202 "dma_device_type": 1 00:16:40.202 }, 00:16:40.202 { 00:16:40.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.202 "dma_device_type": 2 00:16:40.202 } 00:16:40.202 ], 00:16:40.202 "driver_specific": {} 00:16:40.202 } 00:16:40.202 ] 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:40.202 BaseBdev4 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:40.202 07:52:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.462 07:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:40.722 [ 00:16:40.722 { 00:16:40.722 "name": "BaseBdev4", 00:16:40.722 "aliases": [ 00:16:40.722 "944301e6-1859-4992-89fb-e3ed8a2c8384" 00:16:40.722 ], 00:16:40.722 "product_name": "Malloc disk", 00:16:40.722 "block_size": 512, 00:16:40.722 "num_blocks": 65536, 00:16:40.722 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:40.722 "assigned_rate_limits": { 00:16:40.722 
"rw_ios_per_sec": 0, 00:16:40.722 "rw_mbytes_per_sec": 0, 00:16:40.722 "r_mbytes_per_sec": 0, 00:16:40.722 "w_mbytes_per_sec": 0 00:16:40.722 }, 00:16:40.722 "claimed": false, 00:16:40.722 "zoned": false, 00:16:40.722 "supported_io_types": { 00:16:40.722 "read": true, 00:16:40.722 "write": true, 00:16:40.722 "unmap": true, 00:16:40.722 "flush": true, 00:16:40.722 "reset": true, 00:16:40.722 "nvme_admin": false, 00:16:40.722 "nvme_io": false, 00:16:40.722 "nvme_io_md": false, 00:16:40.722 "write_zeroes": true, 00:16:40.722 "zcopy": true, 00:16:40.722 "get_zone_info": false, 00:16:40.722 "zone_management": false, 00:16:40.722 "zone_append": false, 00:16:40.722 "compare": false, 00:16:40.722 "compare_and_write": false, 00:16:40.722 "abort": true, 00:16:40.722 "seek_hole": false, 00:16:40.722 "seek_data": false, 00:16:40.722 "copy": true, 00:16:40.722 "nvme_iov_md": false 00:16:40.722 }, 00:16:40.722 "memory_domains": [ 00:16:40.722 { 00:16:40.722 "dma_device_id": "system", 00:16:40.722 "dma_device_type": 1 00:16:40.722 }, 00:16:40.722 { 00:16:40.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.722 "dma_device_type": 2 00:16:40.722 } 00:16:40.722 ], 00:16:40.722 "driver_specific": {} 00:16:40.722 } 00:16:40.722 ] 00:16:40.722 07:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:40.722 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:40.722 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.722 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:40.722 [2024-07-15 07:52:25.473219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:40.723 [2024-07-15 07:52:25.473249] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:40.723 [2024-07-15 07:52:25.473263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:40.723 [2024-07-15 07:52:25.474296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:40.723 [2024-07-15 07:52:25.474328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.983 07:52:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.983 "name": "Existed_Raid", 00:16:40.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.983 "strip_size_kb": 64, 00:16:40.983 "state": "configuring", 00:16:40.983 "raid_level": "raid0", 00:16:40.983 "superblock": false, 00:16:40.983 "num_base_bdevs": 4, 00:16:40.983 "num_base_bdevs_discovered": 3, 00:16:40.983 "num_base_bdevs_operational": 4, 00:16:40.983 "base_bdevs_list": [ 00:16:40.983 { 00:16:40.983 "name": "BaseBdev1", 00:16:40.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.983 "is_configured": false, 00:16:40.983 "data_offset": 0, 00:16:40.983 "data_size": 0 00:16:40.983 }, 00:16:40.983 { 00:16:40.983 "name": "BaseBdev2", 00:16:40.983 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:40.983 "is_configured": true, 00:16:40.983 "data_offset": 0, 00:16:40.983 "data_size": 65536 00:16:40.983 }, 00:16:40.983 { 00:16:40.983 "name": "BaseBdev3", 00:16:40.983 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:40.983 "is_configured": true, 00:16:40.983 "data_offset": 0, 00:16:40.983 "data_size": 65536 00:16:40.983 }, 00:16:40.983 { 00:16:40.983 "name": "BaseBdev4", 00:16:40.983 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:40.983 "is_configured": true, 00:16:40.983 "data_offset": 0, 00:16:40.983 "data_size": 65536 00:16:40.983 } 00:16:40.983 ] 00:16:40.983 }' 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.983 07:52:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.554 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:41.820 [2024-07-15 07:52:26.371475] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.820 07:52:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.820 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.131 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.131 "name": "Existed_Raid", 00:16:42.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.131 "strip_size_kb": 64, 00:16:42.131 "state": "configuring", 00:16:42.131 "raid_level": "raid0", 00:16:42.131 "superblock": false, 00:16:42.131 "num_base_bdevs": 4, 00:16:42.131 "num_base_bdevs_discovered": 2, 00:16:42.131 "num_base_bdevs_operational": 4, 00:16:42.131 "base_bdevs_list": [ 00:16:42.131 { 00:16:42.131 "name": "BaseBdev1", 00:16:42.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.131 "is_configured": false, 00:16:42.131 "data_offset": 0, 00:16:42.131 "data_size": 0 00:16:42.131 }, 00:16:42.131 { 00:16:42.131 "name": null, 00:16:42.131 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:42.131 "is_configured": false, 00:16:42.131 "data_offset": 0, 00:16:42.131 "data_size": 65536 00:16:42.131 }, 00:16:42.131 { 00:16:42.131 "name": "BaseBdev3", 00:16:42.131 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:42.131 "is_configured": true, 00:16:42.131 "data_offset": 0, 00:16:42.131 "data_size": 65536 00:16:42.131 }, 00:16:42.131 { 00:16:42.131 "name": "BaseBdev4", 00:16:42.131 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:42.131 "is_configured": true, 00:16:42.131 "data_offset": 0, 00:16:42.131 "data_size": 65536 00:16:42.131 } 00:16:42.131 ] 00:16:42.131 }' 00:16:42.131 07:52:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.131 07:52:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.391 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.391 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:42.653 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:42.653 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:42.912 [2024-07-15 07:52:27.499387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:42.912 BaseBdev1 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:42.912 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:43.172 [ 00:16:43.172 { 00:16:43.172 "name": "BaseBdev1", 00:16:43.172 "aliases": [ 00:16:43.172 "9015183e-c0f0-4024-8e93-3fd29500f7f7" 00:16:43.172 ], 00:16:43.172 "product_name": "Malloc disk", 00:16:43.172 "block_size": 512, 00:16:43.172 "num_blocks": 65536, 00:16:43.172 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:43.172 "assigned_rate_limits": { 00:16:43.172 "rw_ios_per_sec": 0, 00:16:43.172 "rw_mbytes_per_sec": 0, 00:16:43.172 "r_mbytes_per_sec": 0, 00:16:43.172 "w_mbytes_per_sec": 0 00:16:43.172 }, 00:16:43.172 "claimed": true, 00:16:43.172 "claim_type": "exclusive_write", 00:16:43.172 "zoned": false, 00:16:43.172 "supported_io_types": { 00:16:43.172 "read": true, 00:16:43.172 "write": true, 00:16:43.172 "unmap": true, 00:16:43.172 "flush": true, 00:16:43.172 "reset": true, 00:16:43.172 "nvme_admin": false, 00:16:43.172 "nvme_io": false, 00:16:43.172 "nvme_io_md": false, 00:16:43.172 "write_zeroes": true, 00:16:43.172 "zcopy": true, 00:16:43.172 "get_zone_info": false, 00:16:43.172 "zone_management": false, 00:16:43.172 "zone_append": false, 00:16:43.172 "compare": false, 00:16:43.172 "compare_and_write": false, 00:16:43.172 "abort": true, 00:16:43.172 "seek_hole": false, 00:16:43.172 "seek_data": false, 00:16:43.172 "copy": true, 00:16:43.172 "nvme_iov_md": false 00:16:43.172 }, 00:16:43.172 "memory_domains": [ 00:16:43.172 { 00:16:43.172 "dma_device_id": "system", 00:16:43.172 "dma_device_type": 1 00:16:43.172 }, 00:16:43.172 { 00:16:43.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.172 "dma_device_type": 2 00:16:43.172 } 00:16:43.172 ], 00:16:43.172 "driver_specific": {} 00:16:43.172 } 00:16:43.172 ] 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.172 07:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:43.433 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.433 "name": "Existed_Raid", 00:16:43.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.433 "strip_size_kb": 64, 00:16:43.433 "state": "configuring", 00:16:43.433 "raid_level": "raid0", 00:16:43.433 "superblock": false, 00:16:43.433 "num_base_bdevs": 4, 00:16:43.433 "num_base_bdevs_discovered": 3, 00:16:43.433 "num_base_bdevs_operational": 4, 00:16:43.433 "base_bdevs_list": [ 00:16:43.433 { 00:16:43.433 "name": "BaseBdev1", 00:16:43.433 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:43.433 "is_configured": true, 00:16:43.433 "data_offset": 0, 00:16:43.433 "data_size": 65536 00:16:43.433 }, 00:16:43.433 { 00:16:43.433 "name": null, 00:16:43.433 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:43.433 "is_configured": false, 00:16:43.433 "data_offset": 0, 00:16:43.433 "data_size": 65536 00:16:43.433 }, 00:16:43.433 { 00:16:43.433 "name": "BaseBdev3", 00:16:43.433 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:43.433 "is_configured": true, 00:16:43.433 "data_offset": 0, 00:16:43.433 "data_size": 65536 00:16:43.433 }, 00:16:43.433 { 00:16:43.433 "name": "BaseBdev4", 00:16:43.433 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:43.433 "is_configured": true, 00:16:43.433 "data_offset": 0, 00:16:43.433 "data_size": 65536 00:16:43.433 } 00:16:43.433 ] 00:16:43.433 }' 00:16:43.433 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.433 07:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.003 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.003 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:44.263 [2024-07-15 07:52:28.979148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.263 07:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.523 07:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.523 "name": "Existed_Raid", 00:16:44.523 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.523 "strip_size_kb": 64, 00:16:44.523 "state": "configuring", 00:16:44.523 "raid_level": "raid0", 00:16:44.523 "superblock": false, 00:16:44.523 "num_base_bdevs": 4, 00:16:44.523 "num_base_bdevs_discovered": 2, 00:16:44.523 "num_base_bdevs_operational": 4, 00:16:44.523 "base_bdevs_list": [ 00:16:44.523 { 00:16:44.523 "name": "BaseBdev1", 00:16:44.523 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:44.523 "is_configured": true, 00:16:44.523 "data_offset": 0, 00:16:44.523 "data_size": 65536 00:16:44.523 }, 00:16:44.523 { 00:16:44.523 "name": null, 00:16:44.523 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:44.523 "is_configured": false, 00:16:44.523 "data_offset": 0, 00:16:44.523 "data_size": 65536 00:16:44.523 }, 00:16:44.523 { 00:16:44.523 "name": null, 00:16:44.523 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:44.523 "is_configured": false, 00:16:44.523 "data_offset": 0, 00:16:44.523 "data_size": 65536 00:16:44.523 }, 00:16:44.523 { 00:16:44.523 "name": "BaseBdev4", 00:16:44.523 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:44.523 "is_configured": true, 00:16:44.523 "data_offset": 0, 00:16:44.523 "data_size": 65536 00:16:44.523 } 00:16:44.523 ] 00:16:44.523 }' 00:16:44.523 07:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.523 07:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.093 07:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.093 07:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:45.353 07:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:45.353 07:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:45.613 [2024-07-15 07:52:30.110031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.613 07:52:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.613 "name": "Existed_Raid", 00:16:45.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.613 "strip_size_kb": 64, 00:16:45.613 "state": "configuring", 00:16:45.613 "raid_level": "raid0", 00:16:45.613 "superblock": false, 00:16:45.613 "num_base_bdevs": 4, 00:16:45.613 "num_base_bdevs_discovered": 3, 00:16:45.613 "num_base_bdevs_operational": 4, 00:16:45.613 "base_bdevs_list": [ 00:16:45.613 { 00:16:45.613 "name": "BaseBdev1", 00:16:45.613 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:45.613 "is_configured": true, 00:16:45.613 "data_offset": 0, 00:16:45.613 "data_size": 65536 00:16:45.613 }, 00:16:45.613 { 00:16:45.613 "name": null, 00:16:45.613 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:45.613 "is_configured": false, 00:16:45.613 "data_offset": 0, 00:16:45.613 "data_size": 65536 00:16:45.613 }, 00:16:45.613 { 00:16:45.613 "name": "BaseBdev3", 00:16:45.613 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:45.613 "is_configured": true, 00:16:45.613 "data_offset": 0, 00:16:45.613 "data_size": 65536 00:16:45.613 }, 00:16:45.613 { 00:16:45.613 "name": "BaseBdev4", 00:16:45.613 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:45.613 "is_configured": true, 00:16:45.613 "data_offset": 0, 00:16:45.613 "data_size": 65536 00:16:45.613 } 00:16:45.613 ] 00:16:45.613 }' 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.613 07:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.184 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.184 07:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:46.444 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:46.444 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:46.704 [2024-07-15 07:52:31.216829] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.704 07:52:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.704 "name": "Existed_Raid", 00:16:46.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.704 "strip_size_kb": 64, 00:16:46.704 "state": "configuring", 00:16:46.704 "raid_level": "raid0", 00:16:46.704 "superblock": false, 00:16:46.704 "num_base_bdevs": 4, 00:16:46.704 "num_base_bdevs_discovered": 2, 00:16:46.704 "num_base_bdevs_operational": 4, 00:16:46.704 "base_bdevs_list": [ 00:16:46.704 { 00:16:46.704 "name": null, 00:16:46.704 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:46.704 "is_configured": false, 00:16:46.704 "data_offset": 0, 00:16:46.704 "data_size": 65536 00:16:46.704 }, 00:16:46.704 { 00:16:46.704 "name": null, 00:16:46.704 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:46.704 "is_configured": false, 00:16:46.704 "data_offset": 0, 00:16:46.704 "data_size": 65536 00:16:46.704 }, 00:16:46.704 { 00:16:46.704 "name": "BaseBdev3", 00:16:46.704 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:46.704 "is_configured": true, 00:16:46.704 "data_offset": 0, 00:16:46.704 "data_size": 65536 00:16:46.704 }, 00:16:46.704 { 00:16:46.704 "name": "BaseBdev4", 00:16:46.704 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:46.704 "is_configured": true, 00:16:46.704 "data_offset": 0, 00:16:46.704 "data_size": 65536 00:16:46.704 } 00:16:46.704 ] 00:16:46.704 }' 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.704 07:52:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.275 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.275 07:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:47.535 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:47.535 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:47.795 [2024-07-15 07:52:32.357518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:47.795 07:52:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.795 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.057 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.057 "name": "Existed_Raid", 00:16:48.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.057 "strip_size_kb": 64, 00:16:48.057 "state": "configuring", 00:16:48.057 "raid_level": "raid0", 00:16:48.057 "superblock": false, 00:16:48.057 "num_base_bdevs": 4, 00:16:48.057 "num_base_bdevs_discovered": 3, 00:16:48.057 "num_base_bdevs_operational": 4, 00:16:48.057 "base_bdevs_list": [ 00:16:48.057 { 00:16:48.057 "name": null, 00:16:48.057 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:48.057 "is_configured": false, 00:16:48.057 "data_offset": 0, 00:16:48.057 "data_size": 65536 00:16:48.057 }, 00:16:48.057 { 00:16:48.057 "name": "BaseBdev2", 00:16:48.057 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:48.057 "is_configured": true, 00:16:48.057 "data_offset": 0, 00:16:48.057 "data_size": 65536 00:16:48.057 }, 00:16:48.057 { 00:16:48.057 "name": "BaseBdev3", 00:16:48.057 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:48.057 "is_configured": true, 00:16:48.057 "data_offset": 0, 00:16:48.057 "data_size": 65536 00:16:48.057 }, 00:16:48.057 { 00:16:48.057 "name": "BaseBdev4", 00:16:48.057 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:48.057 "is_configured": true, 00:16:48.057 "data_offset": 0, 00:16:48.057 "data_size": 65536 00:16:48.057 } 00:16:48.057 ] 00:16:48.057 }' 00:16:48.057 07:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.057 07:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.627 07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.627 07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:48.627 07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:48.627 
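The steps around this point in the trace rebuild the array: the previously removed malloc bdevs are attached again with bdev_raid_add_base_bdev, and the slot left by the deleted BaseBdev1 is filled by creating a fresh malloc bdev (NewBaseBdev) carrying the UUID the configuring raid still records for that slot, after which the array comes back online. A minimal sketch of that path, using only calls shown in this log; the closing jq query on .[0].state is an assumed shorthand, and the UUID value is simply whatever bdev_raid_get_bdevs reports at that moment.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev3      # re-attach still-existing members by name
$rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2
slot_uuid=$($rpc bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
$rpc bdev_malloc_create 32 512 -b NewBaseBdev -u "$slot_uuid"   # recreate the missing member under its recorded UUID
$rpc bdev_wait_for_examine                               # let the raid module examine and claim NewBaseBdev
$rpc bdev_raid_get_bdevs all | jq -r '.[0].state'        # expected output: online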
07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.627 07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:48.887 07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9015183e-c0f0-4024-8e93-3fd29500f7f7 00:16:48.888 [2024-07-15 07:52:33.641626] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:48.888 [2024-07-15 07:52:33.641649] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x220aba0 00:16:48.888 [2024-07-15 07:52:33.641658] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:48.888 [2024-07-15 07:52:33.641814] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x220fbd0 00:16:48.888 [2024-07-15 07:52:33.641906] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x220aba0 00:16:48.888 [2024-07-15 07:52:33.641911] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x220aba0 00:16:48.888 [2024-07-15 07:52:33.642028] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:48.888 NewBaseBdev 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.147 07:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:49.406 [ 00:16:49.406 { 00:16:49.406 "name": "NewBaseBdev", 00:16:49.406 "aliases": [ 00:16:49.406 "9015183e-c0f0-4024-8e93-3fd29500f7f7" 00:16:49.406 ], 00:16:49.406 "product_name": "Malloc disk", 00:16:49.406 "block_size": 512, 00:16:49.406 "num_blocks": 65536, 00:16:49.406 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:49.406 "assigned_rate_limits": { 00:16:49.406 "rw_ios_per_sec": 0, 00:16:49.406 "rw_mbytes_per_sec": 0, 00:16:49.406 "r_mbytes_per_sec": 0, 00:16:49.406 "w_mbytes_per_sec": 0 00:16:49.406 }, 00:16:49.406 "claimed": true, 00:16:49.406 "claim_type": "exclusive_write", 00:16:49.406 "zoned": false, 00:16:49.406 "supported_io_types": { 00:16:49.406 "read": true, 00:16:49.406 "write": true, 00:16:49.406 "unmap": true, 00:16:49.406 "flush": true, 00:16:49.406 "reset": true, 00:16:49.406 "nvme_admin": false, 00:16:49.406 "nvme_io": false, 00:16:49.406 "nvme_io_md": false, 00:16:49.406 "write_zeroes": true, 00:16:49.406 "zcopy": true, 
00:16:49.406 "get_zone_info": false, 00:16:49.406 "zone_management": false, 00:16:49.406 "zone_append": false, 00:16:49.406 "compare": false, 00:16:49.406 "compare_and_write": false, 00:16:49.406 "abort": true, 00:16:49.406 "seek_hole": false, 00:16:49.406 "seek_data": false, 00:16:49.406 "copy": true, 00:16:49.406 "nvme_iov_md": false 00:16:49.406 }, 00:16:49.406 "memory_domains": [ 00:16:49.406 { 00:16:49.406 "dma_device_id": "system", 00:16:49.406 "dma_device_type": 1 00:16:49.406 }, 00:16:49.406 { 00:16:49.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.406 "dma_device_type": 2 00:16:49.406 } 00:16:49.406 ], 00:16:49.406 "driver_specific": {} 00:16:49.406 } 00:16:49.406 ] 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.406 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.666 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.666 "name": "Existed_Raid", 00:16:49.666 "uuid": "5d5d61db-336e-4806-a99a-02e76fee950e", 00:16:49.666 "strip_size_kb": 64, 00:16:49.666 "state": "online", 00:16:49.666 "raid_level": "raid0", 00:16:49.666 "superblock": false, 00:16:49.666 "num_base_bdevs": 4, 00:16:49.666 "num_base_bdevs_discovered": 4, 00:16:49.666 "num_base_bdevs_operational": 4, 00:16:49.666 "base_bdevs_list": [ 00:16:49.666 { 00:16:49.666 "name": "NewBaseBdev", 00:16:49.666 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:49.666 "is_configured": true, 00:16:49.666 "data_offset": 0, 00:16:49.666 "data_size": 65536 00:16:49.666 }, 00:16:49.666 { 00:16:49.666 "name": "BaseBdev2", 00:16:49.666 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:49.666 "is_configured": true, 00:16:49.666 "data_offset": 0, 00:16:49.666 "data_size": 65536 00:16:49.666 }, 00:16:49.666 { 00:16:49.666 "name": "BaseBdev3", 00:16:49.666 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:49.666 "is_configured": true, 00:16:49.666 "data_offset": 0, 00:16:49.666 "data_size": 65536 00:16:49.666 }, 00:16:49.666 { 00:16:49.666 "name": "BaseBdev4", 00:16:49.666 "uuid": 
"944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:49.666 "is_configured": true, 00:16:49.666 "data_offset": 0, 00:16:49.666 "data_size": 65536 00:16:49.666 } 00:16:49.666 ] 00:16:49.666 }' 00:16:49.666 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.666 07:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:50.234 [2024-07-15 07:52:34.925145] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:50.234 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:50.234 "name": "Existed_Raid", 00:16:50.234 "aliases": [ 00:16:50.234 "5d5d61db-336e-4806-a99a-02e76fee950e" 00:16:50.234 ], 00:16:50.234 "product_name": "Raid Volume", 00:16:50.234 "block_size": 512, 00:16:50.234 "num_blocks": 262144, 00:16:50.234 "uuid": "5d5d61db-336e-4806-a99a-02e76fee950e", 00:16:50.234 "assigned_rate_limits": { 00:16:50.234 "rw_ios_per_sec": 0, 00:16:50.234 "rw_mbytes_per_sec": 0, 00:16:50.234 "r_mbytes_per_sec": 0, 00:16:50.234 "w_mbytes_per_sec": 0 00:16:50.234 }, 00:16:50.234 "claimed": false, 00:16:50.234 "zoned": false, 00:16:50.234 "supported_io_types": { 00:16:50.234 "read": true, 00:16:50.234 "write": true, 00:16:50.234 "unmap": true, 00:16:50.234 "flush": true, 00:16:50.234 "reset": true, 00:16:50.234 "nvme_admin": false, 00:16:50.234 "nvme_io": false, 00:16:50.234 "nvme_io_md": false, 00:16:50.234 "write_zeroes": true, 00:16:50.234 "zcopy": false, 00:16:50.234 "get_zone_info": false, 00:16:50.234 "zone_management": false, 00:16:50.234 "zone_append": false, 00:16:50.234 "compare": false, 00:16:50.234 "compare_and_write": false, 00:16:50.234 "abort": false, 00:16:50.234 "seek_hole": false, 00:16:50.234 "seek_data": false, 00:16:50.234 "copy": false, 00:16:50.234 "nvme_iov_md": false 00:16:50.234 }, 00:16:50.234 "memory_domains": [ 00:16:50.234 { 00:16:50.234 "dma_device_id": "system", 00:16:50.234 "dma_device_type": 1 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.234 "dma_device_type": 2 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "dma_device_id": "system", 00:16:50.234 "dma_device_type": 1 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.234 "dma_device_type": 2 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "dma_device_id": "system", 00:16:50.234 "dma_device_type": 1 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.234 "dma_device_type": 2 00:16:50.234 }, 
00:16:50.234 { 00:16:50.234 "dma_device_id": "system", 00:16:50.234 "dma_device_type": 1 00:16:50.234 }, 00:16:50.234 { 00:16:50.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.234 "dma_device_type": 2 00:16:50.234 } 00:16:50.234 ], 00:16:50.234 "driver_specific": { 00:16:50.234 "raid": { 00:16:50.234 "uuid": "5d5d61db-336e-4806-a99a-02e76fee950e", 00:16:50.234 "strip_size_kb": 64, 00:16:50.234 "state": "online", 00:16:50.234 "raid_level": "raid0", 00:16:50.234 "superblock": false, 00:16:50.234 "num_base_bdevs": 4, 00:16:50.234 "num_base_bdevs_discovered": 4, 00:16:50.234 "num_base_bdevs_operational": 4, 00:16:50.234 "base_bdevs_list": [ 00:16:50.234 { 00:16:50.234 "name": "NewBaseBdev", 00:16:50.234 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:50.234 "is_configured": true, 00:16:50.234 "data_offset": 0, 00:16:50.235 "data_size": 65536 00:16:50.235 }, 00:16:50.235 { 00:16:50.235 "name": "BaseBdev2", 00:16:50.235 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:50.235 "is_configured": true, 00:16:50.235 "data_offset": 0, 00:16:50.235 "data_size": 65536 00:16:50.235 }, 00:16:50.235 { 00:16:50.235 "name": "BaseBdev3", 00:16:50.235 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:50.235 "is_configured": true, 00:16:50.235 "data_offset": 0, 00:16:50.235 "data_size": 65536 00:16:50.235 }, 00:16:50.235 { 00:16:50.235 "name": "BaseBdev4", 00:16:50.235 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:50.235 "is_configured": true, 00:16:50.235 "data_offset": 0, 00:16:50.235 "data_size": 65536 00:16:50.235 } 00:16:50.235 ] 00:16:50.235 } 00:16:50.235 } 00:16:50.235 }' 00:16:50.235 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:50.495 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:50.495 BaseBdev2 00:16:50.495 BaseBdev3 00:16:50.495 BaseBdev4' 00:16:50.495 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:50.495 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:50.495 07:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:50.495 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:50.495 "name": "NewBaseBdev", 00:16:50.495 "aliases": [ 00:16:50.495 "9015183e-c0f0-4024-8e93-3fd29500f7f7" 00:16:50.495 ], 00:16:50.495 "product_name": "Malloc disk", 00:16:50.495 "block_size": 512, 00:16:50.495 "num_blocks": 65536, 00:16:50.495 "uuid": "9015183e-c0f0-4024-8e93-3fd29500f7f7", 00:16:50.495 "assigned_rate_limits": { 00:16:50.495 "rw_ios_per_sec": 0, 00:16:50.495 "rw_mbytes_per_sec": 0, 00:16:50.495 "r_mbytes_per_sec": 0, 00:16:50.495 "w_mbytes_per_sec": 0 00:16:50.495 }, 00:16:50.495 "claimed": true, 00:16:50.495 "claim_type": "exclusive_write", 00:16:50.495 "zoned": false, 00:16:50.495 "supported_io_types": { 00:16:50.495 "read": true, 00:16:50.495 "write": true, 00:16:50.495 "unmap": true, 00:16:50.495 "flush": true, 00:16:50.495 "reset": true, 00:16:50.495 "nvme_admin": false, 00:16:50.495 "nvme_io": false, 00:16:50.495 "nvme_io_md": false, 00:16:50.495 "write_zeroes": true, 00:16:50.495 "zcopy": true, 00:16:50.495 "get_zone_info": false, 00:16:50.495 "zone_management": false, 00:16:50.495 "zone_append": false, 
00:16:50.495 "compare": false, 00:16:50.495 "compare_and_write": false, 00:16:50.495 "abort": true, 00:16:50.495 "seek_hole": false, 00:16:50.495 "seek_data": false, 00:16:50.495 "copy": true, 00:16:50.495 "nvme_iov_md": false 00:16:50.495 }, 00:16:50.495 "memory_domains": [ 00:16:50.495 { 00:16:50.495 "dma_device_id": "system", 00:16:50.495 "dma_device_type": 1 00:16:50.495 }, 00:16:50.495 { 00:16:50.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.495 "dma_device_type": 2 00:16:50.495 } 00:16:50.495 ], 00:16:50.495 "driver_specific": {} 00:16:50.495 }' 00:16:50.495 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.495 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.754 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:50.755 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.014 "name": "BaseBdev2", 00:16:51.014 "aliases": [ 00:16:51.014 "1472d416-2238-4267-b12a-0424edeccb16" 00:16:51.014 ], 00:16:51.014 "product_name": "Malloc disk", 00:16:51.014 "block_size": 512, 00:16:51.014 "num_blocks": 65536, 00:16:51.014 "uuid": "1472d416-2238-4267-b12a-0424edeccb16", 00:16:51.014 "assigned_rate_limits": { 00:16:51.014 "rw_ios_per_sec": 0, 00:16:51.014 "rw_mbytes_per_sec": 0, 00:16:51.014 "r_mbytes_per_sec": 0, 00:16:51.014 "w_mbytes_per_sec": 0 00:16:51.014 }, 00:16:51.014 "claimed": true, 00:16:51.014 "claim_type": "exclusive_write", 00:16:51.014 "zoned": false, 00:16:51.014 "supported_io_types": { 00:16:51.014 "read": true, 00:16:51.014 "write": true, 00:16:51.014 "unmap": true, 00:16:51.014 "flush": true, 00:16:51.014 "reset": true, 00:16:51.014 "nvme_admin": false, 00:16:51.014 "nvme_io": false, 00:16:51.014 "nvme_io_md": false, 00:16:51.014 "write_zeroes": true, 00:16:51.014 "zcopy": true, 00:16:51.014 "get_zone_info": false, 00:16:51.014 "zone_management": false, 00:16:51.014 "zone_append": false, 00:16:51.014 "compare": false, 00:16:51.014 "compare_and_write": false, 00:16:51.014 "abort": true, 00:16:51.014 "seek_hole": false, 00:16:51.014 "seek_data": false, 00:16:51.014 
"copy": true, 00:16:51.014 "nvme_iov_md": false 00:16:51.014 }, 00:16:51.014 "memory_domains": [ 00:16:51.014 { 00:16:51.014 "dma_device_id": "system", 00:16:51.014 "dma_device_type": 1 00:16:51.014 }, 00:16:51.014 { 00:16:51.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.014 "dma_device_type": 2 00:16:51.014 } 00:16:51.014 ], 00:16:51.014 "driver_specific": {} 00:16:51.014 }' 00:16:51.014 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.275 07:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.535 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.535 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.535 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.535 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:51.535 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.535 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.535 "name": "BaseBdev3", 00:16:51.535 "aliases": [ 00:16:51.535 "f2c66c58-53a6-4a84-b393-15265c107e1e" 00:16:51.535 ], 00:16:51.535 "product_name": "Malloc disk", 00:16:51.535 "block_size": 512, 00:16:51.535 "num_blocks": 65536, 00:16:51.535 "uuid": "f2c66c58-53a6-4a84-b393-15265c107e1e", 00:16:51.535 "assigned_rate_limits": { 00:16:51.535 "rw_ios_per_sec": 0, 00:16:51.535 "rw_mbytes_per_sec": 0, 00:16:51.535 "r_mbytes_per_sec": 0, 00:16:51.535 "w_mbytes_per_sec": 0 00:16:51.535 }, 00:16:51.535 "claimed": true, 00:16:51.535 "claim_type": "exclusive_write", 00:16:51.535 "zoned": false, 00:16:51.535 "supported_io_types": { 00:16:51.535 "read": true, 00:16:51.535 "write": true, 00:16:51.536 "unmap": true, 00:16:51.536 "flush": true, 00:16:51.536 "reset": true, 00:16:51.536 "nvme_admin": false, 00:16:51.536 "nvme_io": false, 00:16:51.536 "nvme_io_md": false, 00:16:51.536 "write_zeroes": true, 00:16:51.536 "zcopy": true, 00:16:51.536 "get_zone_info": false, 00:16:51.536 "zone_management": false, 00:16:51.536 "zone_append": false, 00:16:51.536 "compare": false, 00:16:51.536 "compare_and_write": false, 00:16:51.536 "abort": true, 00:16:51.536 "seek_hole": false, 00:16:51.536 "seek_data": false, 00:16:51.536 "copy": true, 00:16:51.536 "nvme_iov_md": false 00:16:51.536 }, 00:16:51.536 "memory_domains": [ 00:16:51.536 { 00:16:51.536 "dma_device_id": "system", 00:16:51.536 
"dma_device_type": 1 00:16:51.536 }, 00:16:51.536 { 00:16:51.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.536 "dma_device_type": 2 00:16:51.536 } 00:16:51.536 ], 00:16:51.536 "driver_specific": {} 00:16:51.536 }' 00:16:51.536 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.536 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.795 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.054 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.054 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.054 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:52.054 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.054 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.054 "name": "BaseBdev4", 00:16:52.054 "aliases": [ 00:16:52.054 "944301e6-1859-4992-89fb-e3ed8a2c8384" 00:16:52.054 ], 00:16:52.054 "product_name": "Malloc disk", 00:16:52.054 "block_size": 512, 00:16:52.054 "num_blocks": 65536, 00:16:52.054 "uuid": "944301e6-1859-4992-89fb-e3ed8a2c8384", 00:16:52.054 "assigned_rate_limits": { 00:16:52.054 "rw_ios_per_sec": 0, 00:16:52.054 "rw_mbytes_per_sec": 0, 00:16:52.054 "r_mbytes_per_sec": 0, 00:16:52.054 "w_mbytes_per_sec": 0 00:16:52.054 }, 00:16:52.054 "claimed": true, 00:16:52.054 "claim_type": "exclusive_write", 00:16:52.054 "zoned": false, 00:16:52.054 "supported_io_types": { 00:16:52.054 "read": true, 00:16:52.054 "write": true, 00:16:52.054 "unmap": true, 00:16:52.054 "flush": true, 00:16:52.054 "reset": true, 00:16:52.054 "nvme_admin": false, 00:16:52.054 "nvme_io": false, 00:16:52.054 "nvme_io_md": false, 00:16:52.054 "write_zeroes": true, 00:16:52.054 "zcopy": true, 00:16:52.054 "get_zone_info": false, 00:16:52.054 "zone_management": false, 00:16:52.054 "zone_append": false, 00:16:52.054 "compare": false, 00:16:52.054 "compare_and_write": false, 00:16:52.054 "abort": true, 00:16:52.054 "seek_hole": false, 00:16:52.054 "seek_data": false, 00:16:52.054 "copy": true, 00:16:52.054 "nvme_iov_md": false 00:16:52.054 }, 00:16:52.054 "memory_domains": [ 00:16:52.054 { 00:16:52.054 "dma_device_id": "system", 00:16:52.054 "dma_device_type": 1 00:16:52.054 }, 00:16:52.054 { 00:16:52.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.054 "dma_device_type": 2 00:16:52.054 } 00:16:52.054 ], 
00:16:52.054 "driver_specific": {} 00:16:52.054 }' 00:16:52.054 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.313 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.313 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.313 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.313 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.313 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.313 07:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.313 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.313 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.313 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.573 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.573 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.573 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:52.573 [2024-07-15 07:52:37.323082] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:52.573 [2024-07-15 07:52:37.323100] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:52.573 [2024-07-15 07:52:37.323137] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:52.573 [2024-07-15 07:52:37.323180] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:52.573 [2024-07-15 07:52:37.323187] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x220aba0 name Existed_Raid, state offline 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1654553 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1654553 ']' 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1654553 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1654553 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1654553' 00:16:52.833 killing process with pid 1654553 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1654553 00:16:52.833 [2024-07-15 07:52:37.390798] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 1654553 00:16:52.833 [2024-07-15 07:52:37.411122] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:52.833 00:16:52.833 real 0m27.037s 00:16:52.833 user 0m50.743s 00:16:52.833 sys 0m3.963s 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:52.833 07:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.833 ************************************ 00:16:52.833 END TEST raid_state_function_test 00:16:52.833 ************************************ 00:16:52.833 07:52:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:52.833 07:52:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:16:52.833 07:52:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:52.833 07:52:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:52.833 07:52:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:53.093 ************************************ 00:16:53.093 START TEST raid_state_function_test_sb 00:16:53.093 ************************************ 00:16:53.093 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:16:53.093 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 
00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1659812 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1659812' 00:16:53.094 Process raid pid: 1659812 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1659812 /var/tmp/spdk-raid.sock 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1659812 ']' 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:53.094 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:53.094 07:52:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:53.094 [2024-07-15 07:52:37.674508] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:16:53.094 [2024-07-15 07:52:37.674558] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:53.094 [2024-07-15 07:52:37.763666] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:53.094 [2024-07-15 07:52:37.832008] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:53.353 [2024-07-15 07:52:37.871547] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:53.353 [2024-07-15 07:52:37.871568] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:53.923 07:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:53.923 07:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:53.923 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:53.923 [2024-07-15 07:52:38.674343] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:53.923 [2024-07-15 07:52:38.674369] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:53.923 [2024-07-15 07:52:38.674375] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:53.923 [2024-07-15 07:52:38.674381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:53.923 [2024-07-15 07:52:38.674385] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:53.923 [2024-07-15 07:52:38.674391] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:53.923 [2024-07-15 07:52:38.674395] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:53.923 [2024-07-15 07:52:38.674400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.183 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.183 "name": "Existed_Raid", 00:16:54.183 "uuid": "b62c3777-9b4f-436c-8de2-c2a8c60d8a6a", 00:16:54.183 "strip_size_kb": 64, 00:16:54.183 "state": "configuring", 00:16:54.183 "raid_level": "raid0", 00:16:54.183 "superblock": true, 00:16:54.183 "num_base_bdevs": 4, 00:16:54.183 "num_base_bdevs_discovered": 0, 00:16:54.184 "num_base_bdevs_operational": 4, 00:16:54.184 "base_bdevs_list": [ 00:16:54.184 { 00:16:54.184 "name": "BaseBdev1", 00:16:54.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.184 "is_configured": false, 00:16:54.184 "data_offset": 0, 00:16:54.184 "data_size": 0 00:16:54.184 }, 00:16:54.184 { 00:16:54.184 "name": "BaseBdev2", 00:16:54.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.184 "is_configured": false, 00:16:54.184 "data_offset": 0, 00:16:54.184 "data_size": 0 00:16:54.184 }, 00:16:54.184 { 00:16:54.184 "name": "BaseBdev3", 00:16:54.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.184 "is_configured": false, 00:16:54.184 "data_offset": 0, 00:16:54.184 "data_size": 0 00:16:54.184 }, 00:16:54.184 { 00:16:54.184 "name": "BaseBdev4", 00:16:54.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.184 "is_configured": false, 00:16:54.184 "data_offset": 0, 00:16:54.184 "data_size": 0 00:16:54.184 } 00:16:54.184 ] 00:16:54.184 }' 00:16:54.184 07:52:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.184 07:52:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:54.755 07:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.015 [2024-07-15 07:52:39.576519] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.015 [2024-07-15 07:52:39.576534] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27026f0 name Existed_Raid, state configuring 00:16:55.015 07:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:55.015 [2024-07-15 07:52:39.761012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:55.015 [2024-07-15 07:52:39.761026] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:55.015 [2024-07-15 07:52:39.761031] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:55.015 [2024-07-15 07:52:39.761037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:55.015 [2024-07-15 07:52:39.761041] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:55.015 [2024-07-15 07:52:39.761047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:55.015 [2024-07-15 07:52:39.761051] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
00:16:55.015 [2024-07-15 07:52:39.761057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:55.275 [2024-07-15 07:52:39.952093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:55.275 BaseBdev1 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.275 07:52:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.536 07:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:55.799 [ 00:16:55.799 { 00:16:55.799 "name": "BaseBdev1", 00:16:55.799 "aliases": [ 00:16:55.799 "8777065f-7eae-473e-9b3c-e727e082646a" 00:16:55.799 ], 00:16:55.799 "product_name": "Malloc disk", 00:16:55.799 "block_size": 512, 00:16:55.799 "num_blocks": 65536, 00:16:55.799 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:16:55.799 "assigned_rate_limits": { 00:16:55.799 "rw_ios_per_sec": 0, 00:16:55.799 "rw_mbytes_per_sec": 0, 00:16:55.799 "r_mbytes_per_sec": 0, 00:16:55.799 "w_mbytes_per_sec": 0 00:16:55.799 }, 00:16:55.799 "claimed": true, 00:16:55.799 "claim_type": "exclusive_write", 00:16:55.799 "zoned": false, 00:16:55.799 "supported_io_types": { 00:16:55.799 "read": true, 00:16:55.799 "write": true, 00:16:55.799 "unmap": true, 00:16:55.799 "flush": true, 00:16:55.799 "reset": true, 00:16:55.799 "nvme_admin": false, 00:16:55.799 "nvme_io": false, 00:16:55.799 "nvme_io_md": false, 00:16:55.799 "write_zeroes": true, 00:16:55.799 "zcopy": true, 00:16:55.799 "get_zone_info": false, 00:16:55.799 "zone_management": false, 00:16:55.799 "zone_append": false, 00:16:55.799 "compare": false, 00:16:55.799 "compare_and_write": false, 00:16:55.799 "abort": true, 00:16:55.799 "seek_hole": false, 00:16:55.799 "seek_data": false, 00:16:55.799 "copy": true, 00:16:55.799 "nvme_iov_md": false 00:16:55.799 }, 00:16:55.799 "memory_domains": [ 00:16:55.799 { 00:16:55.799 "dma_device_id": "system", 00:16:55.799 "dma_device_type": 1 00:16:55.799 }, 00:16:55.799 { 00:16:55.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.799 "dma_device_type": 2 00:16:55.799 } 00:16:55.799 ], 00:16:55.799 "driver_specific": {} 00:16:55.799 } 00:16:55.799 ] 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:55.799 
07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:55.799 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:55.799 "name": "Existed_Raid", 00:16:55.799 "uuid": "68febca1-41c0-40e1-945a-43fb61c65378", 00:16:55.799 "strip_size_kb": 64, 00:16:55.799 "state": "configuring", 00:16:55.799 "raid_level": "raid0", 00:16:55.799 "superblock": true, 00:16:55.799 "num_base_bdevs": 4, 00:16:55.799 "num_base_bdevs_discovered": 1, 00:16:55.799 "num_base_bdevs_operational": 4, 00:16:55.799 "base_bdevs_list": [ 00:16:55.799 { 00:16:55.799 "name": "BaseBdev1", 00:16:55.799 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:16:55.799 "is_configured": true, 00:16:55.799 "data_offset": 2048, 00:16:55.799 "data_size": 63488 00:16:55.799 }, 00:16:55.799 { 00:16:55.799 "name": "BaseBdev2", 00:16:55.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.799 "is_configured": false, 00:16:55.799 "data_offset": 0, 00:16:55.799 "data_size": 0 00:16:55.799 }, 00:16:55.799 { 00:16:55.799 "name": "BaseBdev3", 00:16:55.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.799 "is_configured": false, 00:16:55.799 "data_offset": 0, 00:16:55.799 "data_size": 0 00:16:55.799 }, 00:16:55.799 { 00:16:55.799 "name": "BaseBdev4", 00:16:55.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.799 "is_configured": false, 00:16:55.800 "data_offset": 0, 00:16:55.800 "data_size": 0 00:16:55.800 } 00:16:55.800 ] 00:16:55.800 }' 00:16:55.800 07:52:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:55.800 07:52:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:56.779 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:57.037 [2024-07-15 07:52:41.612304] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:57.037 [2024-07-15 07:52:41.612332] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2701f60 name Existed_Raid, state configuring 00:16:57.037 07:52:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:57.296 [2024-07-15 07:52:41.800818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:57.296 [2024-07-15 07:52:41.801914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:57.296 [2024-07-15 07:52:41.801936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:57.296 [2024-07-15 07:52:41.801942] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:57.296 [2024-07-15 07:52:41.801948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:57.296 [2024-07-15 07:52:41.801956] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:57.296 [2024-07-15 07:52:41.801962] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.297 "name": "Existed_Raid", 00:16:57.297 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:16:57.297 "strip_size_kb": 64, 00:16:57.297 "state": "configuring", 00:16:57.297 "raid_level": "raid0", 00:16:57.297 "superblock": true, 00:16:57.297 "num_base_bdevs": 4, 00:16:57.297 "num_base_bdevs_discovered": 1, 00:16:57.297 "num_base_bdevs_operational": 4, 00:16:57.297 "base_bdevs_list": [ 00:16:57.297 { 00:16:57.297 "name": "BaseBdev1", 00:16:57.297 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:16:57.297 "is_configured": true, 00:16:57.297 "data_offset": 2048, 
00:16:57.297 "data_size": 63488 00:16:57.297 }, 00:16:57.297 { 00:16:57.297 "name": "BaseBdev2", 00:16:57.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.297 "is_configured": false, 00:16:57.297 "data_offset": 0, 00:16:57.297 "data_size": 0 00:16:57.297 }, 00:16:57.297 { 00:16:57.297 "name": "BaseBdev3", 00:16:57.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.297 "is_configured": false, 00:16:57.297 "data_offset": 0, 00:16:57.297 "data_size": 0 00:16:57.297 }, 00:16:57.297 { 00:16:57.297 "name": "BaseBdev4", 00:16:57.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.297 "is_configured": false, 00:16:57.297 "data_offset": 0, 00:16:57.297 "data_size": 0 00:16:57.297 } 00:16:57.297 ] 00:16:57.297 }' 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.297 07:52:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:57.865 07:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:58.125 [2024-07-15 07:52:42.728166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:58.125 BaseBdev2 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:58.125 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:58.385 07:52:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:58.385 [ 00:16:58.385 { 00:16:58.385 "name": "BaseBdev2", 00:16:58.385 "aliases": [ 00:16:58.385 "27a04355-5dbd-4900-a936-bf739fc2a8b8" 00:16:58.385 ], 00:16:58.385 "product_name": "Malloc disk", 00:16:58.385 "block_size": 512, 00:16:58.385 "num_blocks": 65536, 00:16:58.385 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:16:58.385 "assigned_rate_limits": { 00:16:58.385 "rw_ios_per_sec": 0, 00:16:58.385 "rw_mbytes_per_sec": 0, 00:16:58.385 "r_mbytes_per_sec": 0, 00:16:58.385 "w_mbytes_per_sec": 0 00:16:58.385 }, 00:16:58.385 "claimed": true, 00:16:58.385 "claim_type": "exclusive_write", 00:16:58.385 "zoned": false, 00:16:58.385 "supported_io_types": { 00:16:58.385 "read": true, 00:16:58.385 "write": true, 00:16:58.385 "unmap": true, 00:16:58.385 "flush": true, 00:16:58.385 "reset": true, 00:16:58.385 "nvme_admin": false, 00:16:58.385 "nvme_io": false, 00:16:58.385 "nvme_io_md": false, 00:16:58.385 "write_zeroes": true, 00:16:58.385 "zcopy": true, 00:16:58.385 "get_zone_info": false, 00:16:58.385 "zone_management": false, 00:16:58.385 "zone_append": false, 00:16:58.385 "compare": false, 
00:16:58.385 "compare_and_write": false, 00:16:58.385 "abort": true, 00:16:58.385 "seek_hole": false, 00:16:58.385 "seek_data": false, 00:16:58.385 "copy": true, 00:16:58.385 "nvme_iov_md": false 00:16:58.385 }, 00:16:58.385 "memory_domains": [ 00:16:58.385 { 00:16:58.385 "dma_device_id": "system", 00:16:58.385 "dma_device_type": 1 00:16:58.385 }, 00:16:58.385 { 00:16:58.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.385 "dma_device_type": 2 00:16:58.385 } 00:16:58.385 ], 00:16:58.385 "driver_specific": {} 00:16:58.385 } 00:16:58.385 ] 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.385 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.655 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.655 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.655 "name": "Existed_Raid", 00:16:58.655 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:16:58.655 "strip_size_kb": 64, 00:16:58.655 "state": "configuring", 00:16:58.656 "raid_level": "raid0", 00:16:58.656 "superblock": true, 00:16:58.656 "num_base_bdevs": 4, 00:16:58.656 "num_base_bdevs_discovered": 2, 00:16:58.656 "num_base_bdevs_operational": 4, 00:16:58.656 "base_bdevs_list": [ 00:16:58.656 { 00:16:58.656 "name": "BaseBdev1", 00:16:58.656 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:16:58.656 "is_configured": true, 00:16:58.656 "data_offset": 2048, 00:16:58.656 "data_size": 63488 00:16:58.656 }, 00:16:58.656 { 00:16:58.656 "name": "BaseBdev2", 00:16:58.656 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:16:58.656 "is_configured": true, 00:16:58.656 "data_offset": 2048, 00:16:58.656 "data_size": 63488 00:16:58.656 }, 00:16:58.656 { 00:16:58.656 "name": "BaseBdev3", 00:16:58.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.656 "is_configured": false, 00:16:58.656 "data_offset": 0, 00:16:58.656 
"data_size": 0 00:16:58.656 }, 00:16:58.656 { 00:16:58.656 "name": "BaseBdev4", 00:16:58.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.657 "is_configured": false, 00:16:58.657 "data_offset": 0, 00:16:58.657 "data_size": 0 00:16:58.657 } 00:16:58.657 ] 00:16:58.657 }' 00:16:58.657 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.657 07:52:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:59.234 07:52:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:59.495 [2024-07-15 07:52:44.052473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:59.495 BaseBdev3 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:59.495 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:59.755 [ 00:16:59.755 { 00:16:59.755 "name": "BaseBdev3", 00:16:59.755 "aliases": [ 00:16:59.755 "001169f3-7852-4d3e-9d36-91db09c2d4ef" 00:16:59.755 ], 00:16:59.755 "product_name": "Malloc disk", 00:16:59.755 "block_size": 512, 00:16:59.755 "num_blocks": 65536, 00:16:59.755 "uuid": "001169f3-7852-4d3e-9d36-91db09c2d4ef", 00:16:59.755 "assigned_rate_limits": { 00:16:59.755 "rw_ios_per_sec": 0, 00:16:59.755 "rw_mbytes_per_sec": 0, 00:16:59.755 "r_mbytes_per_sec": 0, 00:16:59.755 "w_mbytes_per_sec": 0 00:16:59.755 }, 00:16:59.755 "claimed": true, 00:16:59.755 "claim_type": "exclusive_write", 00:16:59.755 "zoned": false, 00:16:59.755 "supported_io_types": { 00:16:59.755 "read": true, 00:16:59.755 "write": true, 00:16:59.755 "unmap": true, 00:16:59.755 "flush": true, 00:16:59.755 "reset": true, 00:16:59.755 "nvme_admin": false, 00:16:59.755 "nvme_io": false, 00:16:59.755 "nvme_io_md": false, 00:16:59.755 "write_zeroes": true, 00:16:59.755 "zcopy": true, 00:16:59.755 "get_zone_info": false, 00:16:59.755 "zone_management": false, 00:16:59.755 "zone_append": false, 00:16:59.755 "compare": false, 00:16:59.755 "compare_and_write": false, 00:16:59.755 "abort": true, 00:16:59.755 "seek_hole": false, 00:16:59.755 "seek_data": false, 00:16:59.755 "copy": true, 00:16:59.755 "nvme_iov_md": false 00:16:59.755 }, 00:16:59.755 "memory_domains": [ 00:16:59.755 { 00:16:59.755 "dma_device_id": "system", 00:16:59.755 "dma_device_type": 1 00:16:59.755 }, 00:16:59.755 { 00:16:59.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.755 "dma_device_type": 2 
00:16:59.755 } 00:16:59.755 ], 00:16:59.755 "driver_specific": {} 00:16:59.755 } 00:16:59.755 ] 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.755 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.016 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.016 "name": "Existed_Raid", 00:17:00.016 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:17:00.016 "strip_size_kb": 64, 00:17:00.016 "state": "configuring", 00:17:00.016 "raid_level": "raid0", 00:17:00.016 "superblock": true, 00:17:00.016 "num_base_bdevs": 4, 00:17:00.016 "num_base_bdevs_discovered": 3, 00:17:00.016 "num_base_bdevs_operational": 4, 00:17:00.016 "base_bdevs_list": [ 00:17:00.016 { 00:17:00.016 "name": "BaseBdev1", 00:17:00.016 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:17:00.016 "is_configured": true, 00:17:00.016 "data_offset": 2048, 00:17:00.016 "data_size": 63488 00:17:00.016 }, 00:17:00.016 { 00:17:00.016 "name": "BaseBdev2", 00:17:00.016 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:17:00.016 "is_configured": true, 00:17:00.016 "data_offset": 2048, 00:17:00.016 "data_size": 63488 00:17:00.016 }, 00:17:00.016 { 00:17:00.016 "name": "BaseBdev3", 00:17:00.016 "uuid": "001169f3-7852-4d3e-9d36-91db09c2d4ef", 00:17:00.016 "is_configured": true, 00:17:00.016 "data_offset": 2048, 00:17:00.016 "data_size": 63488 00:17:00.016 }, 00:17:00.016 { 00:17:00.016 "name": "BaseBdev4", 00:17:00.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.016 "is_configured": false, 00:17:00.016 "data_offset": 0, 00:17:00.016 "data_size": 0 00:17:00.016 } 00:17:00.016 ] 00:17:00.016 }' 00:17:00.016 07:52:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.016 07:52:44 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:17:00.584 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:00.844 [2024-07-15 07:52:45.348800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:00.844 [2024-07-15 07:52:45.348925] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2702fc0 00:17:00.844 [2024-07-15 07:52:45.348933] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:00.844 [2024-07-15 07:52:45.349072] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2702c00 00:17:00.844 [2024-07-15 07:52:45.349166] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2702fc0 00:17:00.844 [2024-07-15 07:52:45.349172] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2702fc0 00:17:00.844 [2024-07-15 07:52:45.349240] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:00.844 BaseBdev4 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:00.844 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:01.104 [ 00:17:01.104 { 00:17:01.104 "name": "BaseBdev4", 00:17:01.104 "aliases": [ 00:17:01.104 "2def703a-c35a-4467-a97f-babc9dede28d" 00:17:01.104 ], 00:17:01.104 "product_name": "Malloc disk", 00:17:01.104 "block_size": 512, 00:17:01.104 "num_blocks": 65536, 00:17:01.104 "uuid": "2def703a-c35a-4467-a97f-babc9dede28d", 00:17:01.104 "assigned_rate_limits": { 00:17:01.104 "rw_ios_per_sec": 0, 00:17:01.104 "rw_mbytes_per_sec": 0, 00:17:01.105 "r_mbytes_per_sec": 0, 00:17:01.105 "w_mbytes_per_sec": 0 00:17:01.105 }, 00:17:01.105 "claimed": true, 00:17:01.105 "claim_type": "exclusive_write", 00:17:01.105 "zoned": false, 00:17:01.105 "supported_io_types": { 00:17:01.105 "read": true, 00:17:01.105 "write": true, 00:17:01.105 "unmap": true, 00:17:01.105 "flush": true, 00:17:01.105 "reset": true, 00:17:01.105 "nvme_admin": false, 00:17:01.105 "nvme_io": false, 00:17:01.105 "nvme_io_md": false, 00:17:01.105 "write_zeroes": true, 00:17:01.105 "zcopy": true, 00:17:01.105 "get_zone_info": false, 00:17:01.105 "zone_management": false, 00:17:01.105 "zone_append": false, 00:17:01.105 "compare": false, 00:17:01.105 "compare_and_write": false, 00:17:01.105 "abort": true, 00:17:01.105 "seek_hole": false, 00:17:01.105 "seek_data": false, 00:17:01.105 "copy": 
true, 00:17:01.105 "nvme_iov_md": false 00:17:01.105 }, 00:17:01.105 "memory_domains": [ 00:17:01.105 { 00:17:01.105 "dma_device_id": "system", 00:17:01.105 "dma_device_type": 1 00:17:01.105 }, 00:17:01.105 { 00:17:01.105 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.105 "dma_device_type": 2 00:17:01.105 } 00:17:01.105 ], 00:17:01.105 "driver_specific": {} 00:17:01.105 } 00:17:01.105 ] 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.105 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.365 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.365 "name": "Existed_Raid", 00:17:01.365 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:17:01.365 "strip_size_kb": 64, 00:17:01.365 "state": "online", 00:17:01.365 "raid_level": "raid0", 00:17:01.365 "superblock": true, 00:17:01.365 "num_base_bdevs": 4, 00:17:01.365 "num_base_bdevs_discovered": 4, 00:17:01.365 "num_base_bdevs_operational": 4, 00:17:01.365 "base_bdevs_list": [ 00:17:01.365 { 00:17:01.365 "name": "BaseBdev1", 00:17:01.365 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:17:01.365 "is_configured": true, 00:17:01.365 "data_offset": 2048, 00:17:01.365 "data_size": 63488 00:17:01.365 }, 00:17:01.365 { 00:17:01.365 "name": "BaseBdev2", 00:17:01.365 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:17:01.365 "is_configured": true, 00:17:01.365 "data_offset": 2048, 00:17:01.365 "data_size": 63488 00:17:01.365 }, 00:17:01.365 { 00:17:01.365 "name": "BaseBdev3", 00:17:01.365 "uuid": "001169f3-7852-4d3e-9d36-91db09c2d4ef", 00:17:01.365 "is_configured": true, 00:17:01.365 "data_offset": 2048, 00:17:01.365 "data_size": 63488 00:17:01.365 }, 00:17:01.365 { 00:17:01.365 "name": "BaseBdev4", 00:17:01.365 "uuid": "2def703a-c35a-4467-a97f-babc9dede28d", 00:17:01.365 
"is_configured": true, 00:17:01.365 "data_offset": 2048, 00:17:01.365 "data_size": 63488 00:17:01.365 } 00:17:01.365 ] 00:17:01.365 }' 00:17:01.365 07:52:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.365 07:52:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:01.935 [2024-07-15 07:52:46.660393] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:01.935 "name": "Existed_Raid", 00:17:01.935 "aliases": [ 00:17:01.935 "02317058-089b-49f8-9692-74ec6d139eec" 00:17:01.935 ], 00:17:01.935 "product_name": "Raid Volume", 00:17:01.935 "block_size": 512, 00:17:01.935 "num_blocks": 253952, 00:17:01.935 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:17:01.935 "assigned_rate_limits": { 00:17:01.935 "rw_ios_per_sec": 0, 00:17:01.935 "rw_mbytes_per_sec": 0, 00:17:01.935 "r_mbytes_per_sec": 0, 00:17:01.935 "w_mbytes_per_sec": 0 00:17:01.935 }, 00:17:01.935 "claimed": false, 00:17:01.935 "zoned": false, 00:17:01.935 "supported_io_types": { 00:17:01.935 "read": true, 00:17:01.935 "write": true, 00:17:01.935 "unmap": true, 00:17:01.935 "flush": true, 00:17:01.935 "reset": true, 00:17:01.935 "nvme_admin": false, 00:17:01.935 "nvme_io": false, 00:17:01.935 "nvme_io_md": false, 00:17:01.935 "write_zeroes": true, 00:17:01.935 "zcopy": false, 00:17:01.935 "get_zone_info": false, 00:17:01.935 "zone_management": false, 00:17:01.935 "zone_append": false, 00:17:01.935 "compare": false, 00:17:01.935 "compare_and_write": false, 00:17:01.935 "abort": false, 00:17:01.935 "seek_hole": false, 00:17:01.935 "seek_data": false, 00:17:01.935 "copy": false, 00:17:01.935 "nvme_iov_md": false 00:17:01.935 }, 00:17:01.935 "memory_domains": [ 00:17:01.935 { 00:17:01.935 "dma_device_id": "system", 00:17:01.935 "dma_device_type": 1 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.935 "dma_device_type": 2 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "dma_device_id": "system", 00:17:01.935 "dma_device_type": 1 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.935 "dma_device_type": 2 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "dma_device_id": "system", 00:17:01.935 "dma_device_type": 1 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.935 "dma_device_type": 2 00:17:01.935 }, 00:17:01.935 { 
00:17:01.935 "dma_device_id": "system", 00:17:01.935 "dma_device_type": 1 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.935 "dma_device_type": 2 00:17:01.935 } 00:17:01.935 ], 00:17:01.935 "driver_specific": { 00:17:01.935 "raid": { 00:17:01.935 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:17:01.935 "strip_size_kb": 64, 00:17:01.935 "state": "online", 00:17:01.935 "raid_level": "raid0", 00:17:01.935 "superblock": true, 00:17:01.935 "num_base_bdevs": 4, 00:17:01.935 "num_base_bdevs_discovered": 4, 00:17:01.935 "num_base_bdevs_operational": 4, 00:17:01.935 "base_bdevs_list": [ 00:17:01.935 { 00:17:01.935 "name": "BaseBdev1", 00:17:01.935 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:17:01.935 "is_configured": true, 00:17:01.935 "data_offset": 2048, 00:17:01.935 "data_size": 63488 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "name": "BaseBdev2", 00:17:01.935 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:17:01.935 "is_configured": true, 00:17:01.935 "data_offset": 2048, 00:17:01.935 "data_size": 63488 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "name": "BaseBdev3", 00:17:01.935 "uuid": "001169f3-7852-4d3e-9d36-91db09c2d4ef", 00:17:01.935 "is_configured": true, 00:17:01.935 "data_offset": 2048, 00:17:01.935 "data_size": 63488 00:17:01.935 }, 00:17:01.935 { 00:17:01.935 "name": "BaseBdev4", 00:17:01.935 "uuid": "2def703a-c35a-4467-a97f-babc9dede28d", 00:17:01.935 "is_configured": true, 00:17:01.935 "data_offset": 2048, 00:17:01.935 "data_size": 63488 00:17:01.935 } 00:17:01.935 ] 00:17:01.935 } 00:17:01.935 } 00:17:01.935 }' 00:17:01.935 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:02.196 BaseBdev2 00:17:02.196 BaseBdev3 00:17:02.196 BaseBdev4' 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.196 "name": "BaseBdev1", 00:17:02.196 "aliases": [ 00:17:02.196 "8777065f-7eae-473e-9b3c-e727e082646a" 00:17:02.196 ], 00:17:02.196 "product_name": "Malloc disk", 00:17:02.196 "block_size": 512, 00:17:02.196 "num_blocks": 65536, 00:17:02.196 "uuid": "8777065f-7eae-473e-9b3c-e727e082646a", 00:17:02.196 "assigned_rate_limits": { 00:17:02.196 "rw_ios_per_sec": 0, 00:17:02.196 "rw_mbytes_per_sec": 0, 00:17:02.196 "r_mbytes_per_sec": 0, 00:17:02.196 "w_mbytes_per_sec": 0 00:17:02.196 }, 00:17:02.196 "claimed": true, 00:17:02.196 "claim_type": "exclusive_write", 00:17:02.196 "zoned": false, 00:17:02.196 "supported_io_types": { 00:17:02.196 "read": true, 00:17:02.196 "write": true, 00:17:02.196 "unmap": true, 00:17:02.196 "flush": true, 00:17:02.196 "reset": true, 00:17:02.196 "nvme_admin": false, 00:17:02.196 "nvme_io": false, 00:17:02.196 "nvme_io_md": false, 00:17:02.196 "write_zeroes": true, 00:17:02.196 "zcopy": true, 00:17:02.196 "get_zone_info": false, 00:17:02.196 "zone_management": false, 00:17:02.196 "zone_append": 
false, 00:17:02.196 "compare": false, 00:17:02.196 "compare_and_write": false, 00:17:02.196 "abort": true, 00:17:02.196 "seek_hole": false, 00:17:02.196 "seek_data": false, 00:17:02.196 "copy": true, 00:17:02.196 "nvme_iov_md": false 00:17:02.196 }, 00:17:02.196 "memory_domains": [ 00:17:02.196 { 00:17:02.196 "dma_device_id": "system", 00:17:02.196 "dma_device_type": 1 00:17:02.196 }, 00:17:02.196 { 00:17:02.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.196 "dma_device_type": 2 00:17:02.196 } 00:17:02.196 ], 00:17:02.196 "driver_specific": {} 00:17:02.196 }' 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.196 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.456 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.456 07:52:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.456 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.716 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.716 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:02.716 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.716 "name": "BaseBdev2", 00:17:02.716 "aliases": [ 00:17:02.716 "27a04355-5dbd-4900-a936-bf739fc2a8b8" 00:17:02.716 ], 00:17:02.716 "product_name": "Malloc disk", 00:17:02.716 "block_size": 512, 00:17:02.716 "num_blocks": 65536, 00:17:02.716 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:17:02.716 "assigned_rate_limits": { 00:17:02.716 "rw_ios_per_sec": 0, 00:17:02.716 "rw_mbytes_per_sec": 0, 00:17:02.716 "r_mbytes_per_sec": 0, 00:17:02.716 "w_mbytes_per_sec": 0 00:17:02.716 }, 00:17:02.716 "claimed": true, 00:17:02.716 "claim_type": "exclusive_write", 00:17:02.716 "zoned": false, 00:17:02.716 "supported_io_types": { 00:17:02.716 "read": true, 00:17:02.716 "write": true, 00:17:02.716 "unmap": true, 00:17:02.716 "flush": true, 00:17:02.716 "reset": true, 00:17:02.716 "nvme_admin": false, 00:17:02.716 "nvme_io": false, 00:17:02.716 "nvme_io_md": false, 00:17:02.716 "write_zeroes": true, 00:17:02.716 "zcopy": true, 00:17:02.716 "get_zone_info": false, 00:17:02.716 "zone_management": false, 00:17:02.716 "zone_append": false, 00:17:02.716 "compare": false, 00:17:02.716 "compare_and_write": false, 00:17:02.716 "abort": true, 00:17:02.716 "seek_hole": 
false, 00:17:02.716 "seek_data": false, 00:17:02.716 "copy": true, 00:17:02.716 "nvme_iov_md": false 00:17:02.716 }, 00:17:02.716 "memory_domains": [ 00:17:02.716 { 00:17:02.716 "dma_device_id": "system", 00:17:02.716 "dma_device_type": 1 00:17:02.716 }, 00:17:02.716 { 00:17:02.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.716 "dma_device_type": 2 00:17:02.716 } 00:17:02.716 ], 00:17:02.716 "driver_specific": {} 00:17:02.716 }' 00:17:02.716 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.716 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.976 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.236 "name": "BaseBdev3", 00:17:03.236 "aliases": [ 00:17:03.236 "001169f3-7852-4d3e-9d36-91db09c2d4ef" 00:17:03.236 ], 00:17:03.236 "product_name": "Malloc disk", 00:17:03.236 "block_size": 512, 00:17:03.236 "num_blocks": 65536, 00:17:03.236 "uuid": "001169f3-7852-4d3e-9d36-91db09c2d4ef", 00:17:03.236 "assigned_rate_limits": { 00:17:03.236 "rw_ios_per_sec": 0, 00:17:03.236 "rw_mbytes_per_sec": 0, 00:17:03.236 "r_mbytes_per_sec": 0, 00:17:03.236 "w_mbytes_per_sec": 0 00:17:03.236 }, 00:17:03.236 "claimed": true, 00:17:03.236 "claim_type": "exclusive_write", 00:17:03.236 "zoned": false, 00:17:03.236 "supported_io_types": { 00:17:03.236 "read": true, 00:17:03.236 "write": true, 00:17:03.236 "unmap": true, 00:17:03.236 "flush": true, 00:17:03.236 "reset": true, 00:17:03.236 "nvme_admin": false, 00:17:03.236 "nvme_io": false, 00:17:03.236 "nvme_io_md": false, 00:17:03.236 "write_zeroes": true, 00:17:03.236 "zcopy": true, 00:17:03.236 "get_zone_info": false, 00:17:03.236 "zone_management": false, 00:17:03.236 "zone_append": false, 00:17:03.236 "compare": false, 00:17:03.236 "compare_and_write": false, 00:17:03.236 "abort": true, 00:17:03.236 "seek_hole": false, 00:17:03.236 "seek_data": false, 00:17:03.236 "copy": true, 00:17:03.236 "nvme_iov_md": false 00:17:03.236 }, 00:17:03.236 
"memory_domains": [ 00:17:03.236 { 00:17:03.236 "dma_device_id": "system", 00:17:03.236 "dma_device_type": 1 00:17:03.236 }, 00:17:03.236 { 00:17:03.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.236 "dma_device_type": 2 00:17:03.236 } 00:17:03.236 ], 00:17:03.236 "driver_specific": {} 00:17:03.236 }' 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.236 07:52:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.496 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.756 "name": "BaseBdev4", 00:17:03.756 "aliases": [ 00:17:03.756 "2def703a-c35a-4467-a97f-babc9dede28d" 00:17:03.756 ], 00:17:03.756 "product_name": "Malloc disk", 00:17:03.756 "block_size": 512, 00:17:03.756 "num_blocks": 65536, 00:17:03.756 "uuid": "2def703a-c35a-4467-a97f-babc9dede28d", 00:17:03.756 "assigned_rate_limits": { 00:17:03.756 "rw_ios_per_sec": 0, 00:17:03.756 "rw_mbytes_per_sec": 0, 00:17:03.756 "r_mbytes_per_sec": 0, 00:17:03.756 "w_mbytes_per_sec": 0 00:17:03.756 }, 00:17:03.756 "claimed": true, 00:17:03.756 "claim_type": "exclusive_write", 00:17:03.756 "zoned": false, 00:17:03.756 "supported_io_types": { 00:17:03.756 "read": true, 00:17:03.756 "write": true, 00:17:03.756 "unmap": true, 00:17:03.756 "flush": true, 00:17:03.756 "reset": true, 00:17:03.756 "nvme_admin": false, 00:17:03.756 "nvme_io": false, 00:17:03.756 "nvme_io_md": false, 00:17:03.756 "write_zeroes": true, 00:17:03.756 "zcopy": true, 00:17:03.756 "get_zone_info": false, 00:17:03.756 "zone_management": false, 00:17:03.756 "zone_append": false, 00:17:03.756 "compare": false, 00:17:03.756 "compare_and_write": false, 00:17:03.756 "abort": true, 00:17:03.756 "seek_hole": false, 00:17:03.756 "seek_data": false, 00:17:03.756 "copy": true, 00:17:03.756 "nvme_iov_md": false 00:17:03.756 }, 00:17:03.756 "memory_domains": [ 00:17:03.756 { 00:17:03.756 "dma_device_id": "system", 00:17:03.756 "dma_device_type": 1 00:17:03.756 }, 
00:17:03.756 { 00:17:03.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.756 "dma_device_type": 2 00:17:03.756 } 00:17:03.756 ], 00:17:03.756 "driver_specific": {} 00:17:03.756 }' 00:17:03.756 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.016 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.276 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.276 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.276 07:52:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:04.276 [2024-07-15 07:52:49.018156] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:04.276 [2024-07-15 07:52:49.018175] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:04.276 [2024-07-15 07:52:49.018211] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.537 07:52:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.537 "name": "Existed_Raid", 00:17:04.537 "uuid": "02317058-089b-49f8-9692-74ec6d139eec", 00:17:04.537 "strip_size_kb": 64, 00:17:04.537 "state": "offline", 00:17:04.537 "raid_level": "raid0", 00:17:04.537 "superblock": true, 00:17:04.537 "num_base_bdevs": 4, 00:17:04.537 "num_base_bdevs_discovered": 3, 00:17:04.537 "num_base_bdevs_operational": 3, 00:17:04.537 "base_bdevs_list": [ 00:17:04.537 { 00:17:04.537 "name": null, 00:17:04.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.537 "is_configured": false, 00:17:04.537 "data_offset": 2048, 00:17:04.537 "data_size": 63488 00:17:04.537 }, 00:17:04.537 { 00:17:04.537 "name": "BaseBdev2", 00:17:04.537 "uuid": "27a04355-5dbd-4900-a936-bf739fc2a8b8", 00:17:04.537 "is_configured": true, 00:17:04.537 "data_offset": 2048, 00:17:04.537 "data_size": 63488 00:17:04.537 }, 00:17:04.537 { 00:17:04.537 "name": "BaseBdev3", 00:17:04.537 "uuid": "001169f3-7852-4d3e-9d36-91db09c2d4ef", 00:17:04.537 "is_configured": true, 00:17:04.537 "data_offset": 2048, 00:17:04.537 "data_size": 63488 00:17:04.537 }, 00:17:04.537 { 00:17:04.537 "name": "BaseBdev4", 00:17:04.537 "uuid": "2def703a-c35a-4467-a97f-babc9dede28d", 00:17:04.537 "is_configured": true, 00:17:04.537 "data_offset": 2048, 00:17:04.537 "data_size": 63488 00:17:04.537 } 00:17:04.537 ] 00:17:04.537 }' 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.537 07:52:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:05.107 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:05.107 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:05.107 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.107 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:05.367 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:05.367 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:05.367 07:52:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:05.627 [2024-07-15 07:52:50.157039] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:05.627 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:05.887 [2024-07-15 07:52:50.547524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:05.887 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:05.887 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:05.887 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.887 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:06.147 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:06.147 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:06.147 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:06.408 [2024-07-15 07:52:50.918245] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:06.408 [2024-07-15 07:52:50.918273] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2702fc0 name Existed_Raid, state offline 00:17:06.408 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:06.408 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:06.408 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.408 07:52:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:06.408 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:06.408 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:06.408 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:06.408 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:06.408 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:06.408 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:06.669 BaseBdev2 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:06.669 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:06.930 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:06.930 [ 00:17:06.930 { 00:17:06.930 "name": "BaseBdev2", 00:17:06.930 "aliases": [ 00:17:06.930 "8311307a-bef5-43d5-9d78-f8e190872ffc" 00:17:06.930 ], 00:17:06.930 "product_name": "Malloc disk", 00:17:06.930 "block_size": 512, 00:17:06.930 "num_blocks": 65536, 00:17:06.930 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:06.930 "assigned_rate_limits": { 00:17:06.930 "rw_ios_per_sec": 0, 00:17:06.930 "rw_mbytes_per_sec": 0, 00:17:06.930 "r_mbytes_per_sec": 0, 00:17:06.930 "w_mbytes_per_sec": 0 00:17:06.930 }, 00:17:06.930 "claimed": false, 00:17:06.930 "zoned": false, 00:17:06.930 "supported_io_types": { 00:17:06.930 "read": true, 00:17:06.930 "write": true, 00:17:06.930 "unmap": true, 00:17:06.930 "flush": true, 00:17:06.930 "reset": true, 00:17:06.930 "nvme_admin": false, 00:17:06.930 "nvme_io": false, 00:17:06.930 "nvme_io_md": false, 00:17:06.930 "write_zeroes": true, 00:17:06.930 "zcopy": true, 00:17:06.930 "get_zone_info": false, 00:17:06.930 "zone_management": false, 00:17:06.930 "zone_append": false, 00:17:06.930 "compare": false, 00:17:06.930 "compare_and_write": false, 00:17:06.930 "abort": true, 00:17:06.930 "seek_hole": false, 00:17:06.930 "seek_data": false, 00:17:06.930 "copy": true, 00:17:06.930 "nvme_iov_md": false 00:17:06.930 }, 00:17:06.930 "memory_domains": [ 00:17:06.930 { 00:17:06.930 "dma_device_id": "system", 00:17:06.930 "dma_device_type": 1 00:17:06.930 }, 00:17:06.930 { 00:17:06.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.930 "dma_device_type": 2 00:17:06.930 } 00:17:06.930 ], 00:17:06.930 "driver_specific": {} 00:17:06.930 } 00:17:06.930 ] 00:17:06.930 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:06.930 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:06.930 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:06.930 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:07.191 BaseBdev3 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:07.191 07:52:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.451 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:07.712 [ 00:17:07.712 { 00:17:07.712 "name": "BaseBdev3", 00:17:07.712 "aliases": [ 00:17:07.712 "2a48db46-99ad-4915-8ba9-186827e773fb" 00:17:07.712 ], 00:17:07.712 "product_name": "Malloc disk", 00:17:07.712 "block_size": 512, 00:17:07.712 "num_blocks": 65536, 00:17:07.712 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:07.712 "assigned_rate_limits": { 00:17:07.712 "rw_ios_per_sec": 0, 00:17:07.712 "rw_mbytes_per_sec": 0, 00:17:07.712 "r_mbytes_per_sec": 0, 00:17:07.712 "w_mbytes_per_sec": 0 00:17:07.712 }, 00:17:07.712 "claimed": false, 00:17:07.712 "zoned": false, 00:17:07.712 "supported_io_types": { 00:17:07.712 "read": true, 00:17:07.712 "write": true, 00:17:07.712 "unmap": true, 00:17:07.712 "flush": true, 00:17:07.712 "reset": true, 00:17:07.712 "nvme_admin": false, 00:17:07.712 "nvme_io": false, 00:17:07.712 "nvme_io_md": false, 00:17:07.712 "write_zeroes": true, 00:17:07.712 "zcopy": true, 00:17:07.712 "get_zone_info": false, 00:17:07.712 "zone_management": false, 00:17:07.712 "zone_append": false, 00:17:07.712 "compare": false, 00:17:07.712 "compare_and_write": false, 00:17:07.712 "abort": true, 00:17:07.712 "seek_hole": false, 00:17:07.712 "seek_data": false, 00:17:07.712 "copy": true, 00:17:07.712 "nvme_iov_md": false 00:17:07.712 }, 00:17:07.712 "memory_domains": [ 00:17:07.712 { 00:17:07.712 "dma_device_id": "system", 00:17:07.712 "dma_device_type": 1 00:17:07.712 }, 00:17:07.712 { 00:17:07.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.712 "dma_device_type": 2 00:17:07.712 } 00:17:07.712 ], 00:17:07.712 "driver_specific": {} 00:17:07.712 } 00:17:07.712 ] 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:07.712 BaseBdev4 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:07.712 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.972 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:08.233 [ 00:17:08.233 { 00:17:08.233 "name": "BaseBdev4", 00:17:08.233 "aliases": [ 00:17:08.233 "54dad984-83bc-48f1-b08f-27649573b992" 00:17:08.233 ], 00:17:08.233 "product_name": "Malloc disk", 00:17:08.233 "block_size": 512, 00:17:08.233 "num_blocks": 65536, 00:17:08.233 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:08.233 "assigned_rate_limits": { 00:17:08.233 "rw_ios_per_sec": 0, 00:17:08.233 "rw_mbytes_per_sec": 0, 00:17:08.233 "r_mbytes_per_sec": 0, 00:17:08.233 "w_mbytes_per_sec": 0 00:17:08.233 }, 00:17:08.233 "claimed": false, 00:17:08.233 "zoned": false, 00:17:08.233 "supported_io_types": { 00:17:08.233 "read": true, 00:17:08.233 "write": true, 00:17:08.233 "unmap": true, 00:17:08.233 "flush": true, 00:17:08.233 "reset": true, 00:17:08.233 "nvme_admin": false, 00:17:08.233 "nvme_io": false, 00:17:08.233 "nvme_io_md": false, 00:17:08.233 "write_zeroes": true, 00:17:08.233 "zcopy": true, 00:17:08.233 "get_zone_info": false, 00:17:08.233 "zone_management": false, 00:17:08.233 "zone_append": false, 00:17:08.233 "compare": false, 00:17:08.233 "compare_and_write": false, 00:17:08.233 "abort": true, 00:17:08.233 "seek_hole": false, 00:17:08.233 "seek_data": false, 00:17:08.233 "copy": true, 00:17:08.233 "nvme_iov_md": false 00:17:08.233 }, 00:17:08.233 "memory_domains": [ 00:17:08.233 { 00:17:08.233 "dma_device_id": "system", 00:17:08.233 "dma_device_type": 1 00:17:08.233 }, 00:17:08.233 { 00:17:08.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.233 "dma_device_type": 2 00:17:08.233 } 00:17:08.233 ], 00:17:08.233 "driver_specific": {} 00:17:08.233 } 00:17:08.233 ] 00:17:08.233 07:52:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:08.233 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:08.233 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:08.233 07:52:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:08.493 [2024-07-15 07:52:52.993363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:08.493 [2024-07-15 07:52:52.993391] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:08.493 [2024-07-15 07:52:52.993403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:08.493 [2024-07-15 07:52:52.994438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:08.493 [2024-07-15 07:52:52.994470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.493 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.493 "name": "Existed_Raid", 00:17:08.493 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:08.493 "strip_size_kb": 64, 00:17:08.493 "state": "configuring", 00:17:08.493 "raid_level": "raid0", 00:17:08.493 "superblock": true, 00:17:08.493 "num_base_bdevs": 4, 00:17:08.493 "num_base_bdevs_discovered": 3, 00:17:08.493 "num_base_bdevs_operational": 4, 00:17:08.493 "base_bdevs_list": [ 00:17:08.493 { 00:17:08.493 "name": "BaseBdev1", 00:17:08.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.493 "is_configured": false, 00:17:08.493 "data_offset": 0, 00:17:08.493 "data_size": 0 00:17:08.493 }, 00:17:08.493 { 00:17:08.493 "name": "BaseBdev2", 00:17:08.493 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:08.493 "is_configured": true, 00:17:08.493 "data_offset": 2048, 00:17:08.493 "data_size": 63488 00:17:08.493 }, 00:17:08.493 { 00:17:08.493 "name": "BaseBdev3", 00:17:08.493 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:08.493 "is_configured": true, 00:17:08.493 "data_offset": 2048, 00:17:08.493 "data_size": 63488 00:17:08.493 }, 00:17:08.493 { 00:17:08.493 "name": "BaseBdev4", 00:17:08.493 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:08.493 "is_configured": true, 00:17:08.493 "data_offset": 2048, 00:17:08.494 "data_size": 63488 00:17:08.494 } 00:17:08.494 ] 00:17:08.494 }' 00:17:08.494 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.494 07:52:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:09.063 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:09.322 [2024-07-15 07:52:53.911643] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:09.322 07:52:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.322 07:52:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:09.582 07:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.582 "name": "Existed_Raid", 00:17:09.582 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:09.582 "strip_size_kb": 64, 00:17:09.582 "state": "configuring", 00:17:09.582 "raid_level": "raid0", 00:17:09.582 "superblock": true, 00:17:09.582 "num_base_bdevs": 4, 00:17:09.582 "num_base_bdevs_discovered": 2, 00:17:09.582 "num_base_bdevs_operational": 4, 00:17:09.582 "base_bdevs_list": [ 00:17:09.582 { 00:17:09.582 "name": "BaseBdev1", 00:17:09.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.582 "is_configured": false, 00:17:09.582 "data_offset": 0, 00:17:09.582 "data_size": 0 00:17:09.582 }, 00:17:09.582 { 00:17:09.582 "name": null, 00:17:09.582 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:09.582 "is_configured": false, 00:17:09.582 "data_offset": 2048, 00:17:09.582 "data_size": 63488 00:17:09.582 }, 00:17:09.582 { 00:17:09.582 "name": "BaseBdev3", 00:17:09.582 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:09.582 "is_configured": true, 00:17:09.582 "data_offset": 2048, 00:17:09.582 "data_size": 63488 00:17:09.582 }, 00:17:09.582 { 00:17:09.582 "name": "BaseBdev4", 00:17:09.582 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:09.582 "is_configured": true, 00:17:09.582 "data_offset": 2048, 00:17:09.582 "data_size": 63488 00:17:09.582 } 00:17:09.582 ] 00:17:09.582 }' 00:17:09.582 07:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.582 07:52:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:10.150 07:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.150 07:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:10.150 07:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:10.150 07:52:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
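Condensed, the create-and-wait step the trace performs here (and again for NewBaseBdev later) is roughly the sketch below. It only reuses the rpc.py invocations visible in this log; the socket path, the 32/512 malloc arguments and the 2000 ms timeout are the values shown in the trace, not general defaults.

    # talk to the per-test RPC socket used throughout this run
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # re-create the missing base bdev ...
    $RPC bdev_malloc_create 32 512 -b BaseBdev1
    # ... then block until examination finishes and the bdev is visible (2000 ms poll)
    $RPC bdev_wait_for_examine
    $RPC bdev_get_bdevs -b BaseBdev1 -t 2000
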
00:17:10.440 [2024-07-15 07:52:55.015421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:10.440 BaseBdev1 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:10.440 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:10.700 [ 00:17:10.700 { 00:17:10.700 "name": "BaseBdev1", 00:17:10.700 "aliases": [ 00:17:10.700 "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1" 00:17:10.700 ], 00:17:10.700 "product_name": "Malloc disk", 00:17:10.700 "block_size": 512, 00:17:10.700 "num_blocks": 65536, 00:17:10.700 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:10.700 "assigned_rate_limits": { 00:17:10.700 "rw_ios_per_sec": 0, 00:17:10.700 "rw_mbytes_per_sec": 0, 00:17:10.700 "r_mbytes_per_sec": 0, 00:17:10.700 "w_mbytes_per_sec": 0 00:17:10.700 }, 00:17:10.700 "claimed": true, 00:17:10.700 "claim_type": "exclusive_write", 00:17:10.700 "zoned": false, 00:17:10.700 "supported_io_types": { 00:17:10.700 "read": true, 00:17:10.700 "write": true, 00:17:10.700 "unmap": true, 00:17:10.700 "flush": true, 00:17:10.700 "reset": true, 00:17:10.700 "nvme_admin": false, 00:17:10.700 "nvme_io": false, 00:17:10.700 "nvme_io_md": false, 00:17:10.700 "write_zeroes": true, 00:17:10.700 "zcopy": true, 00:17:10.700 "get_zone_info": false, 00:17:10.700 "zone_management": false, 00:17:10.700 "zone_append": false, 00:17:10.700 "compare": false, 00:17:10.700 "compare_and_write": false, 00:17:10.700 "abort": true, 00:17:10.700 "seek_hole": false, 00:17:10.700 "seek_data": false, 00:17:10.700 "copy": true, 00:17:10.700 "nvme_iov_md": false 00:17:10.700 }, 00:17:10.700 "memory_domains": [ 00:17:10.700 { 00:17:10.700 "dma_device_id": "system", 00:17:10.700 "dma_device_type": 1 00:17:10.700 }, 00:17:10.700 { 00:17:10.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.700 "dma_device_type": 2 00:17:10.700 } 00:17:10.700 ], 00:17:10.700 "driver_specific": {} 00:17:10.700 } 00:17:10.700 ] 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.700 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:10.961 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.961 "name": "Existed_Raid", 00:17:10.961 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:10.961 "strip_size_kb": 64, 00:17:10.961 "state": "configuring", 00:17:10.961 "raid_level": "raid0", 00:17:10.961 "superblock": true, 00:17:10.961 "num_base_bdevs": 4, 00:17:10.961 "num_base_bdevs_discovered": 3, 00:17:10.961 "num_base_bdevs_operational": 4, 00:17:10.961 "base_bdevs_list": [ 00:17:10.961 { 00:17:10.961 "name": "BaseBdev1", 00:17:10.961 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:10.961 "is_configured": true, 00:17:10.961 "data_offset": 2048, 00:17:10.961 "data_size": 63488 00:17:10.961 }, 00:17:10.961 { 00:17:10.961 "name": null, 00:17:10.961 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:10.961 "is_configured": false, 00:17:10.961 "data_offset": 2048, 00:17:10.961 "data_size": 63488 00:17:10.961 }, 00:17:10.961 { 00:17:10.961 "name": "BaseBdev3", 00:17:10.961 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:10.961 "is_configured": true, 00:17:10.961 "data_offset": 2048, 00:17:10.961 "data_size": 63488 00:17:10.961 }, 00:17:10.961 { 00:17:10.961 "name": "BaseBdev4", 00:17:10.961 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:10.961 "is_configured": true, 00:17:10.961 "data_offset": 2048, 00:17:10.961 "data_size": 63488 00:17:10.961 } 00:17:10.961 ] 00:17:10.961 }' 00:17:10.961 07:52:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.961 07:52:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.532 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.532 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:11.792 [2024-07-15 07:52:56.515241] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.792 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.052 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.052 "name": "Existed_Raid", 00:17:12.052 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:12.052 "strip_size_kb": 64, 00:17:12.052 "state": "configuring", 00:17:12.052 "raid_level": "raid0", 00:17:12.052 "superblock": true, 00:17:12.052 "num_base_bdevs": 4, 00:17:12.052 "num_base_bdevs_discovered": 2, 00:17:12.052 "num_base_bdevs_operational": 4, 00:17:12.052 "base_bdevs_list": [ 00:17:12.052 { 00:17:12.052 "name": "BaseBdev1", 00:17:12.052 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:12.052 "is_configured": true, 00:17:12.052 "data_offset": 2048, 00:17:12.052 "data_size": 63488 00:17:12.052 }, 00:17:12.052 { 00:17:12.052 "name": null, 00:17:12.052 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:12.052 "is_configured": false, 00:17:12.052 "data_offset": 2048, 00:17:12.052 "data_size": 63488 00:17:12.052 }, 00:17:12.052 { 00:17:12.052 "name": null, 00:17:12.052 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:12.052 "is_configured": false, 00:17:12.052 "data_offset": 2048, 00:17:12.052 "data_size": 63488 00:17:12.052 }, 00:17:12.052 { 00:17:12.052 "name": "BaseBdev4", 00:17:12.052 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:12.052 "is_configured": true, 00:17:12.052 "data_offset": 2048, 00:17:12.053 "data_size": 63488 00:17:12.053 } 00:17:12.053 ] 00:17:12.053 }' 00:17:12.053 07:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.053 07:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:12.622 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.622 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:12.882 
07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:12.882 [2024-07-15 07:52:57.618040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.882 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.142 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.142 "name": "Existed_Raid", 00:17:13.142 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:13.142 "strip_size_kb": 64, 00:17:13.142 "state": "configuring", 00:17:13.142 "raid_level": "raid0", 00:17:13.142 "superblock": true, 00:17:13.142 "num_base_bdevs": 4, 00:17:13.142 "num_base_bdevs_discovered": 3, 00:17:13.142 "num_base_bdevs_operational": 4, 00:17:13.142 "base_bdevs_list": [ 00:17:13.142 { 00:17:13.142 "name": "BaseBdev1", 00:17:13.142 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:13.142 "is_configured": true, 00:17:13.142 "data_offset": 2048, 00:17:13.142 "data_size": 63488 00:17:13.142 }, 00:17:13.142 { 00:17:13.142 "name": null, 00:17:13.142 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:13.142 "is_configured": false, 00:17:13.142 "data_offset": 2048, 00:17:13.142 "data_size": 63488 00:17:13.142 }, 00:17:13.142 { 00:17:13.142 "name": "BaseBdev3", 00:17:13.142 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:13.142 "is_configured": true, 00:17:13.142 "data_offset": 2048, 00:17:13.142 "data_size": 63488 00:17:13.142 }, 00:17:13.142 { 00:17:13.142 "name": "BaseBdev4", 00:17:13.142 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:13.142 "is_configured": true, 00:17:13.142 "data_offset": 2048, 00:17:13.142 "data_size": 63488 00:17:13.142 } 00:17:13.142 ] 00:17:13.142 }' 00:17:13.142 07:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.142 07:52:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.711 07:52:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.711 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:13.971 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:13.971 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:14.232 [2024-07-15 07:52:58.732876] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.232 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.232 "name": "Existed_Raid", 00:17:14.232 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:14.232 "strip_size_kb": 64, 00:17:14.232 "state": "configuring", 00:17:14.232 "raid_level": "raid0", 00:17:14.232 "superblock": true, 00:17:14.232 "num_base_bdevs": 4, 00:17:14.232 "num_base_bdevs_discovered": 2, 00:17:14.232 "num_base_bdevs_operational": 4, 00:17:14.232 "base_bdevs_list": [ 00:17:14.232 { 00:17:14.232 "name": null, 00:17:14.232 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:14.233 "is_configured": false, 00:17:14.233 "data_offset": 2048, 00:17:14.233 "data_size": 63488 00:17:14.233 }, 00:17:14.233 { 00:17:14.233 "name": null, 00:17:14.233 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:14.233 "is_configured": false, 00:17:14.233 "data_offset": 2048, 00:17:14.233 "data_size": 63488 00:17:14.233 }, 00:17:14.233 { 00:17:14.233 "name": "BaseBdev3", 00:17:14.233 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:14.233 "is_configured": true, 00:17:14.233 "data_offset": 2048, 00:17:14.233 "data_size": 63488 00:17:14.233 }, 00:17:14.233 { 00:17:14.233 "name": "BaseBdev4", 00:17:14.233 "uuid": 
"54dad984-83bc-48f1-b08f-27649573b992", 00:17:14.233 "is_configured": true, 00:17:14.233 "data_offset": 2048, 00:17:14.233 "data_size": 63488 00:17:14.233 } 00:17:14.233 ] 00:17:14.233 }' 00:17:14.233 07:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.233 07:52:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:14.803 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.803 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:15.063 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:15.063 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:15.323 [2024-07-15 07:52:59.889404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:15.323 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:15.323 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.323 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.323 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:15.323 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.324 07:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.583 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.583 "name": "Existed_Raid", 00:17:15.583 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:15.583 "strip_size_kb": 64, 00:17:15.583 "state": "configuring", 00:17:15.583 "raid_level": "raid0", 00:17:15.583 "superblock": true, 00:17:15.583 "num_base_bdevs": 4, 00:17:15.583 "num_base_bdevs_discovered": 3, 00:17:15.583 "num_base_bdevs_operational": 4, 00:17:15.583 "base_bdevs_list": [ 00:17:15.583 { 00:17:15.583 "name": null, 00:17:15.583 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:15.583 "is_configured": false, 00:17:15.583 "data_offset": 2048, 00:17:15.583 "data_size": 63488 00:17:15.583 }, 00:17:15.583 { 00:17:15.583 "name": "BaseBdev2", 00:17:15.583 "uuid": 
"8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:15.583 "is_configured": true, 00:17:15.583 "data_offset": 2048, 00:17:15.583 "data_size": 63488 00:17:15.583 }, 00:17:15.583 { 00:17:15.583 "name": "BaseBdev3", 00:17:15.583 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:15.583 "is_configured": true, 00:17:15.583 "data_offset": 2048, 00:17:15.583 "data_size": 63488 00:17:15.583 }, 00:17:15.583 { 00:17:15.583 "name": "BaseBdev4", 00:17:15.583 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:15.583 "is_configured": true, 00:17:15.583 "data_offset": 2048, 00:17:15.583 "data_size": 63488 00:17:15.583 } 00:17:15.583 ] 00:17:15.583 }' 00:17:15.583 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.583 07:53:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.153 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.153 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:16.153 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:16.153 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:16.153 07:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.413 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2620f95c-d5e0-4947-8ec9-0a4a298b7cd1 00:17:16.671 [2024-07-15 07:53:01.241737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:16.672 [2024-07-15 07:53:01.241853] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2706c20 00:17:16.672 [2024-07-15 07:53:01.241861] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:16.672 [2024-07-15 07:53:01.242000] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26fa740 00:17:16.672 [2024-07-15 07:53:01.242087] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2706c20 00:17:16.672 [2024-07-15 07:53:01.242093] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2706c20 00:17:16.672 [2024-07-15 07:53:01.242158] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.672 NewBaseBdev 00:17:16.672 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:16.672 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:16.672 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:16.672 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:16.672 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:16.672 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:16.672 07:53:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:16.931 [ 00:17:16.931 { 00:17:16.931 "name": "NewBaseBdev", 00:17:16.931 "aliases": [ 00:17:16.931 "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1" 00:17:16.931 ], 00:17:16.931 "product_name": "Malloc disk", 00:17:16.931 "block_size": 512, 00:17:16.931 "num_blocks": 65536, 00:17:16.931 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:16.931 "assigned_rate_limits": { 00:17:16.931 "rw_ios_per_sec": 0, 00:17:16.931 "rw_mbytes_per_sec": 0, 00:17:16.931 "r_mbytes_per_sec": 0, 00:17:16.931 "w_mbytes_per_sec": 0 00:17:16.931 }, 00:17:16.931 "claimed": true, 00:17:16.931 "claim_type": "exclusive_write", 00:17:16.931 "zoned": false, 00:17:16.931 "supported_io_types": { 00:17:16.931 "read": true, 00:17:16.931 "write": true, 00:17:16.931 "unmap": true, 00:17:16.931 "flush": true, 00:17:16.931 "reset": true, 00:17:16.931 "nvme_admin": false, 00:17:16.931 "nvme_io": false, 00:17:16.931 "nvme_io_md": false, 00:17:16.931 "write_zeroes": true, 00:17:16.931 "zcopy": true, 00:17:16.931 "get_zone_info": false, 00:17:16.931 "zone_management": false, 00:17:16.931 "zone_append": false, 00:17:16.931 "compare": false, 00:17:16.931 "compare_and_write": false, 00:17:16.931 "abort": true, 00:17:16.931 "seek_hole": false, 00:17:16.931 "seek_data": false, 00:17:16.931 "copy": true, 00:17:16.931 "nvme_iov_md": false 00:17:16.931 }, 00:17:16.931 "memory_domains": [ 00:17:16.931 { 00:17:16.931 "dma_device_id": "system", 00:17:16.931 "dma_device_type": 1 00:17:16.931 }, 00:17:16.931 { 00:17:16.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.931 "dma_device_type": 2 00:17:16.931 } 00:17:16.931 ], 00:17:16.931 "driver_specific": {} 00:17:16.931 } 00:17:16.931 ] 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:16.931 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.190 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.190 "name": "Existed_Raid", 00:17:17.190 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:17.190 "strip_size_kb": 64, 00:17:17.190 "state": "online", 00:17:17.190 "raid_level": "raid0", 00:17:17.190 "superblock": true, 00:17:17.190 "num_base_bdevs": 4, 00:17:17.190 "num_base_bdevs_discovered": 4, 00:17:17.190 "num_base_bdevs_operational": 4, 00:17:17.190 "base_bdevs_list": [ 00:17:17.190 { 00:17:17.190 "name": "NewBaseBdev", 00:17:17.190 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:17.190 "is_configured": true, 00:17:17.190 "data_offset": 2048, 00:17:17.190 "data_size": 63488 00:17:17.190 }, 00:17:17.190 { 00:17:17.190 "name": "BaseBdev2", 00:17:17.190 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:17.190 "is_configured": true, 00:17:17.190 "data_offset": 2048, 00:17:17.190 "data_size": 63488 00:17:17.190 }, 00:17:17.190 { 00:17:17.190 "name": "BaseBdev3", 00:17:17.190 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:17.190 "is_configured": true, 00:17:17.190 "data_offset": 2048, 00:17:17.190 "data_size": 63488 00:17:17.190 }, 00:17:17.190 { 00:17:17.190 "name": "BaseBdev4", 00:17:17.190 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:17.190 "is_configured": true, 00:17:17.190 "data_offset": 2048, 00:17:17.190 "data_size": 63488 00:17:17.190 } 00:17:17.190 ] 00:17:17.190 }' 00:17:17.190 07:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.190 07:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:17.760 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:18.020 [2024-07-15 07:53:02.557319] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:18.020 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:18.020 "name": "Existed_Raid", 00:17:18.020 "aliases": [ 00:17:18.020 "8e4658c8-008e-48a5-b285-20df59222109" 00:17:18.020 ], 00:17:18.020 "product_name": "Raid Volume", 00:17:18.020 "block_size": 512, 00:17:18.020 "num_blocks": 253952, 00:17:18.020 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:18.020 "assigned_rate_limits": { 00:17:18.020 "rw_ios_per_sec": 0, 00:17:18.020 "rw_mbytes_per_sec": 0, 00:17:18.020 "r_mbytes_per_sec": 0, 00:17:18.020 "w_mbytes_per_sec": 0 00:17:18.020 }, 00:17:18.020 
"claimed": false, 00:17:18.020 "zoned": false, 00:17:18.020 "supported_io_types": { 00:17:18.020 "read": true, 00:17:18.020 "write": true, 00:17:18.020 "unmap": true, 00:17:18.020 "flush": true, 00:17:18.020 "reset": true, 00:17:18.020 "nvme_admin": false, 00:17:18.020 "nvme_io": false, 00:17:18.020 "nvme_io_md": false, 00:17:18.020 "write_zeroes": true, 00:17:18.020 "zcopy": false, 00:17:18.020 "get_zone_info": false, 00:17:18.020 "zone_management": false, 00:17:18.020 "zone_append": false, 00:17:18.020 "compare": false, 00:17:18.020 "compare_and_write": false, 00:17:18.020 "abort": false, 00:17:18.020 "seek_hole": false, 00:17:18.020 "seek_data": false, 00:17:18.020 "copy": false, 00:17:18.020 "nvme_iov_md": false 00:17:18.020 }, 00:17:18.020 "memory_domains": [ 00:17:18.020 { 00:17:18.020 "dma_device_id": "system", 00:17:18.020 "dma_device_type": 1 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.020 "dma_device_type": 2 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "system", 00:17:18.020 "dma_device_type": 1 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.020 "dma_device_type": 2 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "system", 00:17:18.020 "dma_device_type": 1 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.020 "dma_device_type": 2 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "system", 00:17:18.020 "dma_device_type": 1 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.020 "dma_device_type": 2 00:17:18.020 } 00:17:18.020 ], 00:17:18.020 "driver_specific": { 00:17:18.020 "raid": { 00:17:18.020 "uuid": "8e4658c8-008e-48a5-b285-20df59222109", 00:17:18.020 "strip_size_kb": 64, 00:17:18.020 "state": "online", 00:17:18.020 "raid_level": "raid0", 00:17:18.020 "superblock": true, 00:17:18.020 "num_base_bdevs": 4, 00:17:18.020 "num_base_bdevs_discovered": 4, 00:17:18.020 "num_base_bdevs_operational": 4, 00:17:18.020 "base_bdevs_list": [ 00:17:18.020 { 00:17:18.020 "name": "NewBaseBdev", 00:17:18.020 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:18.020 "is_configured": true, 00:17:18.020 "data_offset": 2048, 00:17:18.020 "data_size": 63488 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "name": "BaseBdev2", 00:17:18.020 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:18.020 "is_configured": true, 00:17:18.020 "data_offset": 2048, 00:17:18.020 "data_size": 63488 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "name": "BaseBdev3", 00:17:18.020 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:18.020 "is_configured": true, 00:17:18.020 "data_offset": 2048, 00:17:18.020 "data_size": 63488 00:17:18.020 }, 00:17:18.020 { 00:17:18.020 "name": "BaseBdev4", 00:17:18.020 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:18.020 "is_configured": true, 00:17:18.020 "data_offset": 2048, 00:17:18.020 "data_size": 63488 00:17:18.020 } 00:17:18.020 ] 00:17:18.020 } 00:17:18.020 } 00:17:18.020 }' 00:17:18.020 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:18.020 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:18.020 BaseBdev2 00:17:18.020 BaseBdev3 00:17:18.020 BaseBdev4' 00:17:18.021 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:17:18.021 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:18.021 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:18.280 "name": "NewBaseBdev", 00:17:18.280 "aliases": [ 00:17:18.280 "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1" 00:17:18.280 ], 00:17:18.280 "product_name": "Malloc disk", 00:17:18.280 "block_size": 512, 00:17:18.280 "num_blocks": 65536, 00:17:18.280 "uuid": "2620f95c-d5e0-4947-8ec9-0a4a298b7cd1", 00:17:18.280 "assigned_rate_limits": { 00:17:18.280 "rw_ios_per_sec": 0, 00:17:18.280 "rw_mbytes_per_sec": 0, 00:17:18.280 "r_mbytes_per_sec": 0, 00:17:18.280 "w_mbytes_per_sec": 0 00:17:18.280 }, 00:17:18.280 "claimed": true, 00:17:18.280 "claim_type": "exclusive_write", 00:17:18.280 "zoned": false, 00:17:18.280 "supported_io_types": { 00:17:18.280 "read": true, 00:17:18.280 "write": true, 00:17:18.280 "unmap": true, 00:17:18.280 "flush": true, 00:17:18.280 "reset": true, 00:17:18.280 "nvme_admin": false, 00:17:18.280 "nvme_io": false, 00:17:18.280 "nvme_io_md": false, 00:17:18.280 "write_zeroes": true, 00:17:18.280 "zcopy": true, 00:17:18.280 "get_zone_info": false, 00:17:18.280 "zone_management": false, 00:17:18.280 "zone_append": false, 00:17:18.280 "compare": false, 00:17:18.280 "compare_and_write": false, 00:17:18.280 "abort": true, 00:17:18.280 "seek_hole": false, 00:17:18.280 "seek_data": false, 00:17:18.280 "copy": true, 00:17:18.280 "nvme_iov_md": false 00:17:18.280 }, 00:17:18.280 "memory_domains": [ 00:17:18.280 { 00:17:18.280 "dma_device_id": "system", 00:17:18.280 "dma_device_type": 1 00:17:18.280 }, 00:17:18.280 { 00:17:18.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.280 "dma_device_type": 2 00:17:18.280 } 00:17:18.280 ], 00:17:18.280 "driver_specific": {} 00:17:18.280 }' 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:18.280 07:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.539 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.539 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:18.539 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.539 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.539 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:18.540 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:18.540 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:18.540 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:18.798 "name": "BaseBdev2", 00:17:18.798 "aliases": [ 00:17:18.798 "8311307a-bef5-43d5-9d78-f8e190872ffc" 00:17:18.798 ], 00:17:18.798 "product_name": "Malloc disk", 00:17:18.798 "block_size": 512, 00:17:18.798 "num_blocks": 65536, 00:17:18.798 "uuid": "8311307a-bef5-43d5-9d78-f8e190872ffc", 00:17:18.798 "assigned_rate_limits": { 00:17:18.798 "rw_ios_per_sec": 0, 00:17:18.798 "rw_mbytes_per_sec": 0, 00:17:18.798 "r_mbytes_per_sec": 0, 00:17:18.798 "w_mbytes_per_sec": 0 00:17:18.798 }, 00:17:18.798 "claimed": true, 00:17:18.798 "claim_type": "exclusive_write", 00:17:18.798 "zoned": false, 00:17:18.798 "supported_io_types": { 00:17:18.798 "read": true, 00:17:18.798 "write": true, 00:17:18.798 "unmap": true, 00:17:18.798 "flush": true, 00:17:18.798 "reset": true, 00:17:18.798 "nvme_admin": false, 00:17:18.798 "nvme_io": false, 00:17:18.798 "nvme_io_md": false, 00:17:18.798 "write_zeroes": true, 00:17:18.798 "zcopy": true, 00:17:18.798 "get_zone_info": false, 00:17:18.798 "zone_management": false, 00:17:18.798 "zone_append": false, 00:17:18.798 "compare": false, 00:17:18.798 "compare_and_write": false, 00:17:18.798 "abort": true, 00:17:18.798 "seek_hole": false, 00:17:18.798 "seek_data": false, 00:17:18.798 "copy": true, 00:17:18.798 "nvme_iov_md": false 00:17:18.798 }, 00:17:18.798 "memory_domains": [ 00:17:18.798 { 00:17:18.798 "dma_device_id": "system", 00:17:18.798 "dma_device_type": 1 00:17:18.798 }, 00:17:18.798 { 00:17:18.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.798 "dma_device_type": 2 00:17:18.798 } 00:17:18.798 ], 00:17:18.798 "driver_specific": {} 00:17:18.798 }' 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:18.798 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:19.057 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:19.057 
07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:19.317 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:19.317 "name": "BaseBdev3", 00:17:19.317 "aliases": [ 00:17:19.317 "2a48db46-99ad-4915-8ba9-186827e773fb" 00:17:19.317 ], 00:17:19.317 "product_name": "Malloc disk", 00:17:19.317 "block_size": 512, 00:17:19.317 "num_blocks": 65536, 00:17:19.317 "uuid": "2a48db46-99ad-4915-8ba9-186827e773fb", 00:17:19.317 "assigned_rate_limits": { 00:17:19.317 "rw_ios_per_sec": 0, 00:17:19.317 "rw_mbytes_per_sec": 0, 00:17:19.317 "r_mbytes_per_sec": 0, 00:17:19.317 "w_mbytes_per_sec": 0 00:17:19.317 }, 00:17:19.317 "claimed": true, 00:17:19.317 "claim_type": "exclusive_write", 00:17:19.317 "zoned": false, 00:17:19.317 "supported_io_types": { 00:17:19.317 "read": true, 00:17:19.317 "write": true, 00:17:19.317 "unmap": true, 00:17:19.317 "flush": true, 00:17:19.317 "reset": true, 00:17:19.317 "nvme_admin": false, 00:17:19.317 "nvme_io": false, 00:17:19.317 "nvme_io_md": false, 00:17:19.317 "write_zeroes": true, 00:17:19.317 "zcopy": true, 00:17:19.317 "get_zone_info": false, 00:17:19.317 "zone_management": false, 00:17:19.317 "zone_append": false, 00:17:19.317 "compare": false, 00:17:19.317 "compare_and_write": false, 00:17:19.317 "abort": true, 00:17:19.317 "seek_hole": false, 00:17:19.317 "seek_data": false, 00:17:19.317 "copy": true, 00:17:19.317 "nvme_iov_md": false 00:17:19.317 }, 00:17:19.317 "memory_domains": [ 00:17:19.317 { 00:17:19.317 "dma_device_id": "system", 00:17:19.317 "dma_device_type": 1 00:17:19.317 }, 00:17:19.317 { 00:17:19.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.317 "dma_device_type": 2 00:17:19.317 } 00:17:19.317 ], 00:17:19.317 "driver_specific": {} 00:17:19.317 }' 00:17:19.317 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.317 07:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.317 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:19.317 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.317 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:19.576 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:19.836 07:53:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:19.836 "name": "BaseBdev4", 00:17:19.836 "aliases": [ 00:17:19.836 "54dad984-83bc-48f1-b08f-27649573b992" 00:17:19.836 ], 00:17:19.836 "product_name": "Malloc disk", 00:17:19.836 "block_size": 512, 00:17:19.836 "num_blocks": 65536, 00:17:19.836 "uuid": "54dad984-83bc-48f1-b08f-27649573b992", 00:17:19.836 "assigned_rate_limits": { 00:17:19.836 "rw_ios_per_sec": 0, 00:17:19.836 "rw_mbytes_per_sec": 0, 00:17:19.836 "r_mbytes_per_sec": 0, 00:17:19.836 "w_mbytes_per_sec": 0 00:17:19.836 }, 00:17:19.836 "claimed": true, 00:17:19.836 "claim_type": "exclusive_write", 00:17:19.836 "zoned": false, 00:17:19.836 "supported_io_types": { 00:17:19.836 "read": true, 00:17:19.836 "write": true, 00:17:19.836 "unmap": true, 00:17:19.836 "flush": true, 00:17:19.836 "reset": true, 00:17:19.836 "nvme_admin": false, 00:17:19.836 "nvme_io": false, 00:17:19.836 "nvme_io_md": false, 00:17:19.836 "write_zeroes": true, 00:17:19.836 "zcopy": true, 00:17:19.836 "get_zone_info": false, 00:17:19.836 "zone_management": false, 00:17:19.836 "zone_append": false, 00:17:19.836 "compare": false, 00:17:19.836 "compare_and_write": false, 00:17:19.836 "abort": true, 00:17:19.836 "seek_hole": false, 00:17:19.836 "seek_data": false, 00:17:19.836 "copy": true, 00:17:19.836 "nvme_iov_md": false 00:17:19.836 }, 00:17:19.836 "memory_domains": [ 00:17:19.836 { 00:17:19.836 "dma_device_id": "system", 00:17:19.836 "dma_device_type": 1 00:17:19.836 }, 00:17:19.836 { 00:17:19.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.836 "dma_device_type": 2 00:17:19.836 } 00:17:19.836 ], 00:17:19.836 "driver_specific": {} 00:17:19.836 }' 00:17:19.836 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.836 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:19.836 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:19.836 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:20.095 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:20.355 [2024-07-15 07:53:04.963152] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:20.355 [2024-07-15 07:53:04.963170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:20.355 [2024-07-15 07:53:04.963205] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:17:20.355 [2024-07-15 07:53:04.963246] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:20.355 [2024-07-15 07:53:04.963252] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2706c20 name Existed_Raid, state offline 00:17:20.355 07:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1659812 00:17:20.355 07:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1659812 ']' 00:17:20.355 07:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1659812 00:17:20.355 07:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:20.355 07:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:20.355 07:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1659812 00:17:20.355 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:20.355 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:20.355 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1659812' 00:17:20.355 killing process with pid 1659812 00:17:20.355 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1659812 00:17:20.355 [2024-07-15 07:53:05.029899] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:20.355 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1659812 00:17:20.355 [2024-07-15 07:53:05.050246] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:20.616 07:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:20.616 00:17:20.616 real 0m27.562s 00:17:20.616 user 0m51.675s 00:17:20.616 sys 0m4.027s 00:17:20.616 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:20.616 07:53:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.616 ************************************ 00:17:20.616 END TEST raid_state_function_test_sb 00:17:20.616 ************************************ 00:17:20.616 07:53:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:20.616 07:53:05 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:17:20.616 07:53:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:20.616 07:53:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:20.616 07:53:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:20.616 ************************************ 00:17:20.616 START TEST raid_superblock_test 00:17:20.616 ************************************ 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local 
base_bdevs_malloc 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1665069 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1665069 /var/tmp/spdk-raid.sock 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1665069 ']' 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:20.616 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:20.616 07:53:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.616 [2024-07-15 07:53:05.309313] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:17:20.616 [2024-07-15 07:53:05.309360] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1665069 ] 00:17:20.876 [2024-07-15 07:53:05.396464] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:20.876 [2024-07-15 07:53:05.462490] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:20.876 [2024-07-15 07:53:05.503864] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:20.876 [2024-07-15 07:53:05.503886] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:21.446 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:21.706 malloc1 00:17:21.706 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:21.966 [2024-07-15 07:53:06.510334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:21.966 [2024-07-15 07:53:06.510366] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:21.966 [2024-07-15 07:53:06.510377] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2697a20 00:17:21.966 [2024-07-15 07:53:06.510384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:21.966 [2024-07-15 07:53:06.511664] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:21.966 [2024-07-15 07:53:06.511683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:21.966 pt1 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:21.966 malloc2 00:17:21.966 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:22.227 [2024-07-15 07:53:06.893357] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:22.227 [2024-07-15 07:53:06.893385] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:22.227 [2024-07-15 07:53:06.893396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2698040 00:17:22.227 [2024-07-15 07:53:06.893402] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:22.227 [2024-07-15 07:53:06.894580] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:22.227 [2024-07-15 07:53:06.894599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:22.227 pt2 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:22.227 07:53:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:22.487 malloc3 00:17:22.487 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:22.747 [2024-07-15 07:53:07.260126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:22.747 [2024-07-15 07:53:07.260155] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:22.747 [2024-07-15 07:53:07.260165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2698540 00:17:22.747 [2024-07-15 07:53:07.260171] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:22.747 [2024-07-15 07:53:07.261339] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:22.747 [2024-07-15 07:53:07.261356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:22.747 pt3 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:22.747 malloc4 00:17:22.747 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:23.007 [2024-07-15 07:53:07.647037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:23.007 [2024-07-15 07:53:07.647063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:23.007 [2024-07-15 07:53:07.647072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2845d60 00:17:23.007 [2024-07-15 07:53:07.647078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:23.007 [2024-07-15 07:53:07.648250] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:23.007 [2024-07-15 07:53:07.648267] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:23.007 pt4 00:17:23.007 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:23.007 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:23.007 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:17:23.267 [2024-07-15 07:53:07.823494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:23.267 [2024-07-15 07:53:07.824478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:23.267 [2024-07-15 07:53:07.824517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:23.267 [2024-07-15 07:53:07.824549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:23.267 [2024-07-15 07:53:07.824681] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2842e20 00:17:23.267 [2024-07-15 07:53:07.824688] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:23.267 [2024-07-15 07:53:07.824843] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2699000 00:17:23.267 [2024-07-15 07:53:07.824951] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2842e20 00:17:23.267 [2024-07-15 07:53:07.824960] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2842e20 00:17:23.267 [2024-07-15 07:53:07.825026] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.267 07:53:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:23.528 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.528 "name": "raid_bdev1", 00:17:23.528 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:23.528 "strip_size_kb": 64, 00:17:23.528 "state": "online", 00:17:23.528 "raid_level": "raid0", 00:17:23.528 "superblock": true, 00:17:23.528 "num_base_bdevs": 4, 00:17:23.528 "num_base_bdevs_discovered": 4, 00:17:23.528 "num_base_bdevs_operational": 4, 00:17:23.528 "base_bdevs_list": [ 00:17:23.528 { 00:17:23.528 "name": "pt1", 00:17:23.528 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:23.528 "is_configured": true, 00:17:23.528 "data_offset": 2048, 00:17:23.528 "data_size": 63488 00:17:23.528 }, 00:17:23.528 { 00:17:23.528 "name": "pt2", 00:17:23.528 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:23.528 "is_configured": true, 00:17:23.528 "data_offset": 2048, 00:17:23.528 "data_size": 63488 00:17:23.528 }, 00:17:23.528 { 00:17:23.528 "name": "pt3", 00:17:23.528 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:23.528 "is_configured": true, 00:17:23.528 "data_offset": 2048, 00:17:23.528 "data_size": 63488 00:17:23.528 }, 00:17:23.528 { 00:17:23.528 "name": "pt4", 00:17:23.528 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:23.528 "is_configured": true, 00:17:23.528 "data_offset": 2048, 00:17:23.528 "data_size": 63488 00:17:23.528 } 00:17:23.528 ] 00:17:23.528 }' 00:17:23.528 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.528 07:53:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.108 07:53:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:24.109 [2024-07-15 07:53:08.746059] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:24.109 "name": "raid_bdev1", 00:17:24.109 "aliases": [ 00:17:24.109 "bc1c37bb-35f6-4e64-ab05-ce7f82b51600" 00:17:24.109 ], 00:17:24.109 "product_name": "Raid Volume", 00:17:24.109 "block_size": 512, 00:17:24.109 "num_blocks": 253952, 00:17:24.109 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:24.109 "assigned_rate_limits": { 00:17:24.109 "rw_ios_per_sec": 0, 00:17:24.109 "rw_mbytes_per_sec": 0, 00:17:24.109 "r_mbytes_per_sec": 0, 00:17:24.109 "w_mbytes_per_sec": 0 00:17:24.109 }, 00:17:24.109 "claimed": false, 00:17:24.109 "zoned": false, 00:17:24.109 "supported_io_types": { 00:17:24.109 "read": true, 00:17:24.109 "write": true, 00:17:24.109 "unmap": true, 00:17:24.109 "flush": true, 00:17:24.109 "reset": true, 00:17:24.109 "nvme_admin": false, 00:17:24.109 "nvme_io": false, 00:17:24.109 "nvme_io_md": false, 00:17:24.109 "write_zeroes": true, 00:17:24.109 "zcopy": false, 00:17:24.109 "get_zone_info": false, 00:17:24.109 "zone_management": false, 00:17:24.109 "zone_append": false, 00:17:24.109 "compare": false, 00:17:24.109 "compare_and_write": false, 00:17:24.109 "abort": false, 00:17:24.109 "seek_hole": false, 00:17:24.109 "seek_data": false, 00:17:24.109 "copy": false, 00:17:24.109 "nvme_iov_md": false 00:17:24.109 }, 00:17:24.109 "memory_domains": [ 00:17:24.109 { 00:17:24.109 "dma_device_id": "system", 00:17:24.109 "dma_device_type": 1 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.109 "dma_device_type": 2 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "system", 00:17:24.109 "dma_device_type": 1 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.109 "dma_device_type": 2 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "system", 00:17:24.109 "dma_device_type": 1 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.109 "dma_device_type": 2 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "system", 00:17:24.109 "dma_device_type": 1 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.109 "dma_device_type": 2 00:17:24.109 } 00:17:24.109 ], 00:17:24.109 "driver_specific": { 00:17:24.109 "raid": { 00:17:24.109 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:24.109 "strip_size_kb": 64, 00:17:24.109 "state": "online", 00:17:24.109 "raid_level": "raid0", 00:17:24.109 "superblock": 
true, 00:17:24.109 "num_base_bdevs": 4, 00:17:24.109 "num_base_bdevs_discovered": 4, 00:17:24.109 "num_base_bdevs_operational": 4, 00:17:24.109 "base_bdevs_list": [ 00:17:24.109 { 00:17:24.109 "name": "pt1", 00:17:24.109 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:24.109 "is_configured": true, 00:17:24.109 "data_offset": 2048, 00:17:24.109 "data_size": 63488 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "name": "pt2", 00:17:24.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:24.109 "is_configured": true, 00:17:24.109 "data_offset": 2048, 00:17:24.109 "data_size": 63488 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "name": "pt3", 00:17:24.109 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:24.109 "is_configured": true, 00:17:24.109 "data_offset": 2048, 00:17:24.109 "data_size": 63488 00:17:24.109 }, 00:17:24.109 { 00:17:24.109 "name": "pt4", 00:17:24.109 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:24.109 "is_configured": true, 00:17:24.109 "data_offset": 2048, 00:17:24.109 "data_size": 63488 00:17:24.109 } 00:17:24.109 ] 00:17:24.109 } 00:17:24.109 } 00:17:24.109 }' 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:24.109 pt2 00:17:24.109 pt3 00:17:24.109 pt4' 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:24.109 07:53:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.403 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.403 "name": "pt1", 00:17:24.403 "aliases": [ 00:17:24.403 "00000000-0000-0000-0000-000000000001" 00:17:24.403 ], 00:17:24.403 "product_name": "passthru", 00:17:24.403 "block_size": 512, 00:17:24.403 "num_blocks": 65536, 00:17:24.403 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:24.403 "assigned_rate_limits": { 00:17:24.403 "rw_ios_per_sec": 0, 00:17:24.403 "rw_mbytes_per_sec": 0, 00:17:24.403 "r_mbytes_per_sec": 0, 00:17:24.403 "w_mbytes_per_sec": 0 00:17:24.403 }, 00:17:24.403 "claimed": true, 00:17:24.403 "claim_type": "exclusive_write", 00:17:24.403 "zoned": false, 00:17:24.403 "supported_io_types": { 00:17:24.403 "read": true, 00:17:24.403 "write": true, 00:17:24.403 "unmap": true, 00:17:24.403 "flush": true, 00:17:24.403 "reset": true, 00:17:24.403 "nvme_admin": false, 00:17:24.403 "nvme_io": false, 00:17:24.403 "nvme_io_md": false, 00:17:24.403 "write_zeroes": true, 00:17:24.403 "zcopy": true, 00:17:24.403 "get_zone_info": false, 00:17:24.403 "zone_management": false, 00:17:24.403 "zone_append": false, 00:17:24.403 "compare": false, 00:17:24.403 "compare_and_write": false, 00:17:24.403 "abort": true, 00:17:24.403 "seek_hole": false, 00:17:24.403 "seek_data": false, 00:17:24.403 "copy": true, 00:17:24.403 "nvme_iov_md": false 00:17:24.403 }, 00:17:24.403 "memory_domains": [ 00:17:24.403 { 00:17:24.403 "dma_device_id": "system", 00:17:24.403 "dma_device_type": 1 00:17:24.403 }, 00:17:24.403 { 00:17:24.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.403 "dma_device_type": 2 00:17:24.403 } 00:17:24.403 ], 00:17:24.403 "driver_specific": { 00:17:24.403 "passthru": 
{ 00:17:24.403 "name": "pt1", 00:17:24.403 "base_bdev_name": "malloc1" 00:17:24.403 } 00:17:24.403 } 00:17:24.403 }' 00:17:24.403 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.403 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.403 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.403 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:24.691 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.950 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.950 "name": "pt2", 00:17:24.950 "aliases": [ 00:17:24.950 "00000000-0000-0000-0000-000000000002" 00:17:24.950 ], 00:17:24.950 "product_name": "passthru", 00:17:24.950 "block_size": 512, 00:17:24.950 "num_blocks": 65536, 00:17:24.950 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:24.950 "assigned_rate_limits": { 00:17:24.950 "rw_ios_per_sec": 0, 00:17:24.950 "rw_mbytes_per_sec": 0, 00:17:24.950 "r_mbytes_per_sec": 0, 00:17:24.950 "w_mbytes_per_sec": 0 00:17:24.950 }, 00:17:24.950 "claimed": true, 00:17:24.950 "claim_type": "exclusive_write", 00:17:24.950 "zoned": false, 00:17:24.950 "supported_io_types": { 00:17:24.950 "read": true, 00:17:24.950 "write": true, 00:17:24.950 "unmap": true, 00:17:24.950 "flush": true, 00:17:24.950 "reset": true, 00:17:24.950 "nvme_admin": false, 00:17:24.950 "nvme_io": false, 00:17:24.950 "nvme_io_md": false, 00:17:24.950 "write_zeroes": true, 00:17:24.950 "zcopy": true, 00:17:24.950 "get_zone_info": false, 00:17:24.950 "zone_management": false, 00:17:24.950 "zone_append": false, 00:17:24.950 "compare": false, 00:17:24.950 "compare_and_write": false, 00:17:24.950 "abort": true, 00:17:24.950 "seek_hole": false, 00:17:24.950 "seek_data": false, 00:17:24.950 "copy": true, 00:17:24.950 "nvme_iov_md": false 00:17:24.950 }, 00:17:24.950 "memory_domains": [ 00:17:24.950 { 00:17:24.950 "dma_device_id": "system", 00:17:24.950 "dma_device_type": 1 00:17:24.950 }, 00:17:24.950 { 00:17:24.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.950 "dma_device_type": 2 00:17:24.950 } 00:17:24.950 ], 00:17:24.950 "driver_specific": { 00:17:24.950 "passthru": { 00:17:24.950 "name": "pt2", 00:17:24.950 "base_bdev_name": "malloc2" 00:17:24.950 } 00:17:24.950 } 00:17:24.950 }' 00:17:24.950 
07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.950 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.950 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.950 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.950 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:25.210 07:53:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.469 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.469 "name": "pt3", 00:17:25.469 "aliases": [ 00:17:25.469 "00000000-0000-0000-0000-000000000003" 00:17:25.469 ], 00:17:25.469 "product_name": "passthru", 00:17:25.469 "block_size": 512, 00:17:25.469 "num_blocks": 65536, 00:17:25.469 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:25.469 "assigned_rate_limits": { 00:17:25.469 "rw_ios_per_sec": 0, 00:17:25.469 "rw_mbytes_per_sec": 0, 00:17:25.469 "r_mbytes_per_sec": 0, 00:17:25.469 "w_mbytes_per_sec": 0 00:17:25.469 }, 00:17:25.469 "claimed": true, 00:17:25.469 "claim_type": "exclusive_write", 00:17:25.469 "zoned": false, 00:17:25.469 "supported_io_types": { 00:17:25.469 "read": true, 00:17:25.469 "write": true, 00:17:25.469 "unmap": true, 00:17:25.469 "flush": true, 00:17:25.469 "reset": true, 00:17:25.469 "nvme_admin": false, 00:17:25.469 "nvme_io": false, 00:17:25.469 "nvme_io_md": false, 00:17:25.469 "write_zeroes": true, 00:17:25.469 "zcopy": true, 00:17:25.469 "get_zone_info": false, 00:17:25.469 "zone_management": false, 00:17:25.469 "zone_append": false, 00:17:25.469 "compare": false, 00:17:25.469 "compare_and_write": false, 00:17:25.469 "abort": true, 00:17:25.469 "seek_hole": false, 00:17:25.469 "seek_data": false, 00:17:25.469 "copy": true, 00:17:25.469 "nvme_iov_md": false 00:17:25.469 }, 00:17:25.469 "memory_domains": [ 00:17:25.469 { 00:17:25.469 "dma_device_id": "system", 00:17:25.469 "dma_device_type": 1 00:17:25.469 }, 00:17:25.469 { 00:17:25.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.469 "dma_device_type": 2 00:17:25.469 } 00:17:25.469 ], 00:17:25.469 "driver_specific": { 00:17:25.469 "passthru": { 00:17:25.469 "name": "pt3", 00:17:25.469 "base_bdev_name": "malloc3" 00:17:25.469 } 00:17:25.469 } 00:17:25.469 }' 00:17:25.469 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.469 07:53:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.469 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:25.469 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.729 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:25.730 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.990 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.990 "name": "pt4", 00:17:25.990 "aliases": [ 00:17:25.990 "00000000-0000-0000-0000-000000000004" 00:17:25.990 ], 00:17:25.990 "product_name": "passthru", 00:17:25.990 "block_size": 512, 00:17:25.990 "num_blocks": 65536, 00:17:25.990 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:25.990 "assigned_rate_limits": { 00:17:25.990 "rw_ios_per_sec": 0, 00:17:25.990 "rw_mbytes_per_sec": 0, 00:17:25.990 "r_mbytes_per_sec": 0, 00:17:25.990 "w_mbytes_per_sec": 0 00:17:25.990 }, 00:17:25.990 "claimed": true, 00:17:25.990 "claim_type": "exclusive_write", 00:17:25.990 "zoned": false, 00:17:25.990 "supported_io_types": { 00:17:25.990 "read": true, 00:17:25.990 "write": true, 00:17:25.990 "unmap": true, 00:17:25.990 "flush": true, 00:17:25.990 "reset": true, 00:17:25.990 "nvme_admin": false, 00:17:25.990 "nvme_io": false, 00:17:25.990 "nvme_io_md": false, 00:17:25.990 "write_zeroes": true, 00:17:25.990 "zcopy": true, 00:17:25.990 "get_zone_info": false, 00:17:25.990 "zone_management": false, 00:17:25.990 "zone_append": false, 00:17:25.990 "compare": false, 00:17:25.990 "compare_and_write": false, 00:17:25.990 "abort": true, 00:17:25.990 "seek_hole": false, 00:17:25.990 "seek_data": false, 00:17:25.990 "copy": true, 00:17:25.990 "nvme_iov_md": false 00:17:25.990 }, 00:17:25.990 "memory_domains": [ 00:17:25.990 { 00:17:25.990 "dma_device_id": "system", 00:17:25.990 "dma_device_type": 1 00:17:25.990 }, 00:17:25.990 { 00:17:25.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.990 "dma_device_type": 2 00:17:25.990 } 00:17:25.990 ], 00:17:25.990 "driver_specific": { 00:17:25.990 "passthru": { 00:17:25.990 "name": "pt4", 00:17:25.990 "base_bdev_name": "malloc4" 00:17:25.990 } 00:17:25.990 } 00:17:25.990 }' 00:17:25.990 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.990 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.990 07:53:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:25.990 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:26.250 07:53:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:26.509 [2024-07-15 07:53:11.152125] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:26.509 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bc1c37bb-35f6-4e64-ab05-ce7f82b51600 00:17:26.509 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z bc1c37bb-35f6-4e64-ab05-ce7f82b51600 ']' 00:17:26.509 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:26.768 [2024-07-15 07:53:11.344362] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:26.768 [2024-07-15 07:53:11.344375] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:26.768 [2024-07-15 07:53:11.344410] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:26.768 [2024-07-15 07:53:11.344455] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:26.768 [2024-07-15 07:53:11.344461] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2842e20 name raid_bdev1, state offline 00:17:26.768 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.768 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:27.027 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:27.027 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:27.027 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:27.027 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:27.027 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:27.027 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:27.286 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:27.286 07:53:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:27.546 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:27.546 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:27.546 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:27.546 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:27.804 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:28.064 [2024-07-15 07:53:12.647613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:28.064 [2024-07-15 07:53:12.648673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:28.064 [2024-07-15 07:53:12.648705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
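The records around this point show bdev_raid.sh@456 deliberately re-running bdev_raid_create against malloc1-malloc4, which still carry the old raid superblock, and expecting the -17 "File exists" JSON-RPC error shown just below. A minimal standalone sketch of that expected-failure check, reusing the rpc.py path, socket, and arguments from this log; the if/grep wrapper is illustrative only and is not part of the SPDK test suite:

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # Re-creating raid_bdev1 from base bdevs that already hold a foreign superblock must fail.
    if out=$($rpc bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 2>&1); then
        echo "unexpected success: $out" >&2
        exit 1
    fi
    # JSON-RPC error code -17 with this message is what the records below report.
    echo "$out" | grep -q 'File exists'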
00:17:28.064 [2024-07-15 07:53:12.648735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:28.064 [2024-07-15 07:53:12.648769] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:28.064 [2024-07-15 07:53:12.648796] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:28.064 [2024-07-15 07:53:12.648810] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:28.064 [2024-07-15 07:53:12.648824] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:17:28.064 [2024-07-15 07:53:12.648833] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:28.064 [2024-07-15 07:53:12.648839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2840c80 name raid_bdev1, state configuring 00:17:28.064 request: 00:17:28.064 { 00:17:28.064 "name": "raid_bdev1", 00:17:28.064 "raid_level": "raid0", 00:17:28.064 "base_bdevs": [ 00:17:28.064 "malloc1", 00:17:28.064 "malloc2", 00:17:28.064 "malloc3", 00:17:28.065 "malloc4" 00:17:28.065 ], 00:17:28.065 "strip_size_kb": 64, 00:17:28.065 "superblock": false, 00:17:28.065 "method": "bdev_raid_create", 00:17:28.065 "req_id": 1 00:17:28.065 } 00:17:28.065 Got JSON-RPC error response 00:17:28.065 response: 00:17:28.065 { 00:17:28.065 "code": -17, 00:17:28.065 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:28.065 } 00:17:28.065 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:28.065 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:28.065 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:28.065 07:53:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:28.065 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.065 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:28.324 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:28.324 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:28.324 07:53:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:28.324 [2024-07-15 07:53:13.032533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:28.324 [2024-07-15 07:53:13.032554] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.324 [2024-07-15 07:53:13.032564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2846a20 00:17:28.324 [2024-07-15 07:53:13.032570] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.324 [2024-07-15 07:53:13.033808] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.324 [2024-07-15 07:53:13.033825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:28.324 [2024-07-15 
07:53:13.033869] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:28.324 [2024-07-15 07:53:13.033887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:28.324 pt1 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.324 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:28.584 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.584 "name": "raid_bdev1", 00:17:28.584 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:28.584 "strip_size_kb": 64, 00:17:28.584 "state": "configuring", 00:17:28.584 "raid_level": "raid0", 00:17:28.584 "superblock": true, 00:17:28.584 "num_base_bdevs": 4, 00:17:28.584 "num_base_bdevs_discovered": 1, 00:17:28.584 "num_base_bdevs_operational": 4, 00:17:28.584 "base_bdevs_list": [ 00:17:28.584 { 00:17:28.584 "name": "pt1", 00:17:28.584 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:28.584 "is_configured": true, 00:17:28.584 "data_offset": 2048, 00:17:28.584 "data_size": 63488 00:17:28.584 }, 00:17:28.584 { 00:17:28.584 "name": null, 00:17:28.584 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:28.584 "is_configured": false, 00:17:28.584 "data_offset": 2048, 00:17:28.584 "data_size": 63488 00:17:28.584 }, 00:17:28.584 { 00:17:28.584 "name": null, 00:17:28.584 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:28.584 "is_configured": false, 00:17:28.584 "data_offset": 2048, 00:17:28.584 "data_size": 63488 00:17:28.584 }, 00:17:28.584 { 00:17:28.584 "name": null, 00:17:28.584 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:28.584 "is_configured": false, 00:17:28.584 "data_offset": 2048, 00:17:28.584 "data_size": 63488 00:17:28.584 } 00:17:28.584 ] 00:17:28.584 }' 00:17:28.584 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.584 07:53:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.153 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:17:29.153 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:29.413 [2024-07-15 07:53:13.954865] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:29.413 [2024-07-15 07:53:13.954893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:29.413 [2024-07-15 07:53:13.954905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2845b20 00:17:29.413 [2024-07-15 07:53:13.954911] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:29.413 [2024-07-15 07:53:13.955163] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:29.413 [2024-07-15 07:53:13.955173] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:29.413 [2024-07-15 07:53:13.955213] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:29.413 [2024-07-15 07:53:13.955225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:29.413 pt2 00:17:29.413 07:53:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:29.413 [2024-07-15 07:53:14.143399] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.413 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:29.673 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.673 "name": "raid_bdev1", 00:17:29.673 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:29.673 "strip_size_kb": 64, 00:17:29.673 "state": "configuring", 00:17:29.673 "raid_level": "raid0", 00:17:29.673 "superblock": true, 00:17:29.673 "num_base_bdevs": 4, 00:17:29.673 "num_base_bdevs_discovered": 1, 00:17:29.673 "num_base_bdevs_operational": 4, 00:17:29.673 "base_bdevs_list": [ 00:17:29.673 { 00:17:29.673 "name": "pt1", 00:17:29.673 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:29.673 "is_configured": true, 00:17:29.673 "data_offset": 2048, 00:17:29.673 "data_size": 63488 00:17:29.673 }, 00:17:29.673 { 
00:17:29.673 "name": null, 00:17:29.673 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:29.673 "is_configured": false, 00:17:29.673 "data_offset": 2048, 00:17:29.673 "data_size": 63488 00:17:29.673 }, 00:17:29.673 { 00:17:29.673 "name": null, 00:17:29.673 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:29.673 "is_configured": false, 00:17:29.673 "data_offset": 2048, 00:17:29.673 "data_size": 63488 00:17:29.673 }, 00:17:29.673 { 00:17:29.673 "name": null, 00:17:29.673 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:29.673 "is_configured": false, 00:17:29.673 "data_offset": 2048, 00:17:29.673 "data_size": 63488 00:17:29.673 } 00:17:29.673 ] 00:17:29.673 }' 00:17:29.673 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.673 07:53:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.242 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:30.242 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:30.242 07:53:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:30.502 [2024-07-15 07:53:15.045685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:30.502 [2024-07-15 07:53:15.045719] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:30.502 [2024-07-15 07:53:15.045729] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2847570 00:17:30.502 [2024-07-15 07:53:15.045735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:30.502 [2024-07-15 07:53:15.045987] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:30.502 [2024-07-15 07:53:15.045996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:30.502 [2024-07-15 07:53:15.046038] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:30.502 [2024-07-15 07:53:15.046049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:30.502 pt2 00:17:30.502 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:30.502 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:30.502 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:30.502 [2024-07-15 07:53:15.238172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:30.502 [2024-07-15 07:53:15.238191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:30.502 [2024-07-15 07:53:15.238199] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28420d0 00:17:30.502 [2024-07-15 07:53:15.238205] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:30.503 [2024-07-15 07:53:15.238418] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:30.503 [2024-07-15 07:53:15.238427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:30.503 [2024-07-15 07:53:15.238460] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:30.503 [2024-07-15 07:53:15.238471] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:30.503 pt3 00:17:30.503 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:30.503 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:30.503 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:31.072 [2024-07-15 07:53:15.767513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:31.072 [2024-07-15 07:53:15.767539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:31.072 [2024-07-15 07:53:15.767548] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2696f90 00:17:31.072 [2024-07-15 07:53:15.767554] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:31.072 [2024-07-15 07:53:15.767787] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:31.072 [2024-07-15 07:53:15.767797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:31.072 [2024-07-15 07:53:15.767832] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:31.072 [2024-07-15 07:53:15.767844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:31.072 [2024-07-15 07:53:15.767940] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2847960 00:17:31.072 [2024-07-15 07:53:15.767946] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:31.072 [2024-07-15 07:53:15.768085] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2842a70 00:17:31.072 [2024-07-15 07:53:15.768185] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2847960 00:17:31.072 [2024-07-15 07:53:15.768190] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2847960 00:17:31.072 [2024-07-15 07:53:15.768261] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:31.072 pt4 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.072 07:53:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.072 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:31.339 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.339 "name": "raid_bdev1", 00:17:31.339 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:31.339 "strip_size_kb": 64, 00:17:31.339 "state": "online", 00:17:31.339 "raid_level": "raid0", 00:17:31.339 "superblock": true, 00:17:31.339 "num_base_bdevs": 4, 00:17:31.339 "num_base_bdevs_discovered": 4, 00:17:31.339 "num_base_bdevs_operational": 4, 00:17:31.339 "base_bdevs_list": [ 00:17:31.339 { 00:17:31.339 "name": "pt1", 00:17:31.339 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:31.339 "is_configured": true, 00:17:31.339 "data_offset": 2048, 00:17:31.339 "data_size": 63488 00:17:31.339 }, 00:17:31.339 { 00:17:31.339 "name": "pt2", 00:17:31.339 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:31.339 "is_configured": true, 00:17:31.339 "data_offset": 2048, 00:17:31.339 "data_size": 63488 00:17:31.339 }, 00:17:31.339 { 00:17:31.339 "name": "pt3", 00:17:31.339 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:31.339 "is_configured": true, 00:17:31.339 "data_offset": 2048, 00:17:31.339 "data_size": 63488 00:17:31.339 }, 00:17:31.339 { 00:17:31.339 "name": "pt4", 00:17:31.339 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:31.339 "is_configured": true, 00:17:31.339 "data_offset": 2048, 00:17:31.339 "data_size": 63488 00:17:31.339 } 00:17:31.339 ] 00:17:31.339 }' 00:17:31.339 07:53:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.339 07:53:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:32.277 07:53:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:32.537 [2024-07-15 07:53:17.063029] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:32.537 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:32.537 "name": "raid_bdev1", 00:17:32.537 "aliases": [ 00:17:32.537 "bc1c37bb-35f6-4e64-ab05-ce7f82b51600" 00:17:32.537 ], 00:17:32.537 "product_name": "Raid Volume", 00:17:32.537 "block_size": 512, 00:17:32.537 "num_blocks": 253952, 00:17:32.537 "uuid": 
"bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:32.537 "assigned_rate_limits": { 00:17:32.537 "rw_ios_per_sec": 0, 00:17:32.537 "rw_mbytes_per_sec": 0, 00:17:32.537 "r_mbytes_per_sec": 0, 00:17:32.537 "w_mbytes_per_sec": 0 00:17:32.537 }, 00:17:32.537 "claimed": false, 00:17:32.537 "zoned": false, 00:17:32.537 "supported_io_types": { 00:17:32.537 "read": true, 00:17:32.537 "write": true, 00:17:32.537 "unmap": true, 00:17:32.537 "flush": true, 00:17:32.537 "reset": true, 00:17:32.537 "nvme_admin": false, 00:17:32.537 "nvme_io": false, 00:17:32.537 "nvme_io_md": false, 00:17:32.537 "write_zeroes": true, 00:17:32.537 "zcopy": false, 00:17:32.537 "get_zone_info": false, 00:17:32.537 "zone_management": false, 00:17:32.537 "zone_append": false, 00:17:32.537 "compare": false, 00:17:32.537 "compare_and_write": false, 00:17:32.537 "abort": false, 00:17:32.537 "seek_hole": false, 00:17:32.537 "seek_data": false, 00:17:32.537 "copy": false, 00:17:32.537 "nvme_iov_md": false 00:17:32.537 }, 00:17:32.537 "memory_domains": [ 00:17:32.537 { 00:17:32.537 "dma_device_id": "system", 00:17:32.537 "dma_device_type": 1 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.537 "dma_device_type": 2 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "system", 00:17:32.537 "dma_device_type": 1 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.537 "dma_device_type": 2 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "system", 00:17:32.537 "dma_device_type": 1 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.537 "dma_device_type": 2 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "system", 00:17:32.537 "dma_device_type": 1 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.537 "dma_device_type": 2 00:17:32.537 } 00:17:32.537 ], 00:17:32.537 "driver_specific": { 00:17:32.537 "raid": { 00:17:32.537 "uuid": "bc1c37bb-35f6-4e64-ab05-ce7f82b51600", 00:17:32.537 "strip_size_kb": 64, 00:17:32.537 "state": "online", 00:17:32.537 "raid_level": "raid0", 00:17:32.537 "superblock": true, 00:17:32.537 "num_base_bdevs": 4, 00:17:32.537 "num_base_bdevs_discovered": 4, 00:17:32.537 "num_base_bdevs_operational": 4, 00:17:32.537 "base_bdevs_list": [ 00:17:32.537 { 00:17:32.537 "name": "pt1", 00:17:32.537 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:32.537 "is_configured": true, 00:17:32.537 "data_offset": 2048, 00:17:32.537 "data_size": 63488 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "name": "pt2", 00:17:32.537 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:32.537 "is_configured": true, 00:17:32.537 "data_offset": 2048, 00:17:32.537 "data_size": 63488 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "name": "pt3", 00:17:32.537 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:32.537 "is_configured": true, 00:17:32.537 "data_offset": 2048, 00:17:32.537 "data_size": 63488 00:17:32.537 }, 00:17:32.537 { 00:17:32.537 "name": "pt4", 00:17:32.537 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:32.537 "is_configured": true, 00:17:32.537 "data_offset": 2048, 00:17:32.537 "data_size": 63488 00:17:32.537 } 00:17:32.537 ] 00:17:32.537 } 00:17:32.537 } 00:17:32.537 }' 00:17:32.537 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:32.537 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:17:32.537 pt2 00:17:32.537 pt3 00:17:32.537 pt4' 00:17:32.537 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.537 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:32.537 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.797 "name": "pt1", 00:17:32.797 "aliases": [ 00:17:32.797 "00000000-0000-0000-0000-000000000001" 00:17:32.797 ], 00:17:32.797 "product_name": "passthru", 00:17:32.797 "block_size": 512, 00:17:32.797 "num_blocks": 65536, 00:17:32.797 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:32.797 "assigned_rate_limits": { 00:17:32.797 "rw_ios_per_sec": 0, 00:17:32.797 "rw_mbytes_per_sec": 0, 00:17:32.797 "r_mbytes_per_sec": 0, 00:17:32.797 "w_mbytes_per_sec": 0 00:17:32.797 }, 00:17:32.797 "claimed": true, 00:17:32.797 "claim_type": "exclusive_write", 00:17:32.797 "zoned": false, 00:17:32.797 "supported_io_types": { 00:17:32.797 "read": true, 00:17:32.797 "write": true, 00:17:32.797 "unmap": true, 00:17:32.797 "flush": true, 00:17:32.797 "reset": true, 00:17:32.797 "nvme_admin": false, 00:17:32.797 "nvme_io": false, 00:17:32.797 "nvme_io_md": false, 00:17:32.797 "write_zeroes": true, 00:17:32.797 "zcopy": true, 00:17:32.797 "get_zone_info": false, 00:17:32.797 "zone_management": false, 00:17:32.797 "zone_append": false, 00:17:32.797 "compare": false, 00:17:32.797 "compare_and_write": false, 00:17:32.797 "abort": true, 00:17:32.797 "seek_hole": false, 00:17:32.797 "seek_data": false, 00:17:32.797 "copy": true, 00:17:32.797 "nvme_iov_md": false 00:17:32.797 }, 00:17:32.797 "memory_domains": [ 00:17:32.797 { 00:17:32.797 "dma_device_id": "system", 00:17:32.797 "dma_device_type": 1 00:17:32.797 }, 00:17:32.797 { 00:17:32.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.797 "dma_device_type": 2 00:17:32.797 } 00:17:32.797 ], 00:17:32.797 "driver_specific": { 00:17:32.797 "passthru": { 00:17:32.797 "name": "pt1", 00:17:32.797 "base_bdev_name": "malloc1" 00:17:32.797 } 00:17:32.797 } 00:17:32.797 }' 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.797 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:33.057 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:33.317 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:33.317 "name": "pt2", 00:17:33.317 "aliases": [ 00:17:33.317 "00000000-0000-0000-0000-000000000002" 00:17:33.317 ], 00:17:33.317 "product_name": "passthru", 00:17:33.317 "block_size": 512, 00:17:33.317 "num_blocks": 65536, 00:17:33.317 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:33.317 "assigned_rate_limits": { 00:17:33.317 "rw_ios_per_sec": 0, 00:17:33.317 "rw_mbytes_per_sec": 0, 00:17:33.317 "r_mbytes_per_sec": 0, 00:17:33.317 "w_mbytes_per_sec": 0 00:17:33.317 }, 00:17:33.317 "claimed": true, 00:17:33.317 "claim_type": "exclusive_write", 00:17:33.317 "zoned": false, 00:17:33.317 "supported_io_types": { 00:17:33.317 "read": true, 00:17:33.317 "write": true, 00:17:33.317 "unmap": true, 00:17:33.317 "flush": true, 00:17:33.317 "reset": true, 00:17:33.317 "nvme_admin": false, 00:17:33.317 "nvme_io": false, 00:17:33.317 "nvme_io_md": false, 00:17:33.317 "write_zeroes": true, 00:17:33.317 "zcopy": true, 00:17:33.317 "get_zone_info": false, 00:17:33.317 "zone_management": false, 00:17:33.317 "zone_append": false, 00:17:33.317 "compare": false, 00:17:33.317 "compare_and_write": false, 00:17:33.317 "abort": true, 00:17:33.317 "seek_hole": false, 00:17:33.317 "seek_data": false, 00:17:33.317 "copy": true, 00:17:33.317 "nvme_iov_md": false 00:17:33.317 }, 00:17:33.317 "memory_domains": [ 00:17:33.317 { 00:17:33.317 "dma_device_id": "system", 00:17:33.317 "dma_device_type": 1 00:17:33.317 }, 00:17:33.317 { 00:17:33.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.317 "dma_device_type": 2 00:17:33.317 } 00:17:33.317 ], 00:17:33.317 "driver_specific": { 00:17:33.317 "passthru": { 00:17:33.317 "name": "pt2", 00:17:33.317 "base_bdev_name": "malloc2" 00:17:33.317 } 00:17:33.317 } 00:17:33.317 }' 00:17:33.317 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.318 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.318 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:33.318 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.318 07:53:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.318 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:33.318 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.318 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:33.577 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:33.837 "name": "pt3", 00:17:33.837 "aliases": [ 00:17:33.837 "00000000-0000-0000-0000-000000000003" 00:17:33.837 ], 00:17:33.837 "product_name": "passthru", 00:17:33.837 "block_size": 512, 00:17:33.837 "num_blocks": 65536, 00:17:33.837 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:33.837 "assigned_rate_limits": { 00:17:33.837 "rw_ios_per_sec": 0, 00:17:33.837 "rw_mbytes_per_sec": 0, 00:17:33.837 "r_mbytes_per_sec": 0, 00:17:33.837 "w_mbytes_per_sec": 0 00:17:33.837 }, 00:17:33.837 "claimed": true, 00:17:33.837 "claim_type": "exclusive_write", 00:17:33.837 "zoned": false, 00:17:33.837 "supported_io_types": { 00:17:33.837 "read": true, 00:17:33.837 "write": true, 00:17:33.837 "unmap": true, 00:17:33.837 "flush": true, 00:17:33.837 "reset": true, 00:17:33.837 "nvme_admin": false, 00:17:33.837 "nvme_io": false, 00:17:33.837 "nvme_io_md": false, 00:17:33.837 "write_zeroes": true, 00:17:33.837 "zcopy": true, 00:17:33.837 "get_zone_info": false, 00:17:33.837 "zone_management": false, 00:17:33.837 "zone_append": false, 00:17:33.837 "compare": false, 00:17:33.837 "compare_and_write": false, 00:17:33.837 "abort": true, 00:17:33.837 "seek_hole": false, 00:17:33.837 "seek_data": false, 00:17:33.837 "copy": true, 00:17:33.837 "nvme_iov_md": false 00:17:33.837 }, 00:17:33.837 "memory_domains": [ 00:17:33.837 { 00:17:33.837 "dma_device_id": "system", 00:17:33.837 "dma_device_type": 1 00:17:33.837 }, 00:17:33.837 { 00:17:33.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.837 "dma_device_type": 2 00:17:33.837 } 00:17:33.837 ], 00:17:33.837 "driver_specific": { 00:17:33.837 "passthru": { 00:17:33.837 "name": "pt3", 00:17:33.837 "base_bdev_name": "malloc3" 00:17:33.837 } 00:17:33.837 } 00:17:33.837 }' 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:33.837 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:34.097 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:34.097 
07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:34.357 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:34.357 "name": "pt4", 00:17:34.357 "aliases": [ 00:17:34.357 "00000000-0000-0000-0000-000000000004" 00:17:34.357 ], 00:17:34.357 "product_name": "passthru", 00:17:34.357 "block_size": 512, 00:17:34.357 "num_blocks": 65536, 00:17:34.357 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:34.357 "assigned_rate_limits": { 00:17:34.357 "rw_ios_per_sec": 0, 00:17:34.357 "rw_mbytes_per_sec": 0, 00:17:34.357 "r_mbytes_per_sec": 0, 00:17:34.357 "w_mbytes_per_sec": 0 00:17:34.357 }, 00:17:34.357 "claimed": true, 00:17:34.357 "claim_type": "exclusive_write", 00:17:34.357 "zoned": false, 00:17:34.357 "supported_io_types": { 00:17:34.357 "read": true, 00:17:34.357 "write": true, 00:17:34.357 "unmap": true, 00:17:34.357 "flush": true, 00:17:34.357 "reset": true, 00:17:34.357 "nvme_admin": false, 00:17:34.357 "nvme_io": false, 00:17:34.357 "nvme_io_md": false, 00:17:34.357 "write_zeroes": true, 00:17:34.357 "zcopy": true, 00:17:34.357 "get_zone_info": false, 00:17:34.357 "zone_management": false, 00:17:34.357 "zone_append": false, 00:17:34.357 "compare": false, 00:17:34.357 "compare_and_write": false, 00:17:34.357 "abort": true, 00:17:34.357 "seek_hole": false, 00:17:34.357 "seek_data": false, 00:17:34.357 "copy": true, 00:17:34.357 "nvme_iov_md": false 00:17:34.357 }, 00:17:34.357 "memory_domains": [ 00:17:34.357 { 00:17:34.357 "dma_device_id": "system", 00:17:34.357 "dma_device_type": 1 00:17:34.357 }, 00:17:34.357 { 00:17:34.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.357 "dma_device_type": 2 00:17:34.357 } 00:17:34.357 ], 00:17:34.357 "driver_specific": { 00:17:34.357 "passthru": { 00:17:34.357 "name": "pt4", 00:17:34.357 "base_bdev_name": "malloc4" 00:17:34.357 } 00:17:34.357 } 00:17:34.357 }' 00:17:34.357 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.357 07:53:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.357 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:34.357 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.357 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:34.618 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:34.903 [2024-07-15 07:53:19.481139] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:34.903 07:53:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' bc1c37bb-35f6-4e64-ab05-ce7f82b51600 '!=' bc1c37bb-35f6-4e64-ab05-ce7f82b51600 ']' 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1665069 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1665069 ']' 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1665069 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1665069 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1665069' 00:17:34.903 killing process with pid 1665069 00:17:34.903 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1665069 00:17:34.903 [2024-07-15 07:53:19.534131] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:34.903 [2024-07-15 07:53:19.534172] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:34.904 [2024-07-15 07:53:19.534220] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:34.904 [2024-07-15 07:53:19.534227] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2847960 name raid_bdev1, state offline 00:17:34.904 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1665069 00:17:34.904 [2024-07-15 07:53:19.554683] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:35.164 07:53:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:35.164 00:17:35.164 real 0m14.424s 00:17:35.164 user 0m26.567s 00:17:35.164 sys 0m2.111s 00:17:35.164 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:35.164 07:53:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.164 ************************************ 00:17:35.164 END TEST raid_superblock_test 00:17:35.164 ************************************ 00:17:35.164 07:53:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:35.164 07:53:19 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:17:35.164 07:53:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:35.164 07:53:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:35.164 07:53:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:35.164 ************************************ 00:17:35.164 START TEST raid_read_error_test 00:17:35.164 ************************************ 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- 
# raid_io_error_test raid0 4 read 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:35.164 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.gvJfw2SGJX 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1667719 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1667719 /var/tmp/spdk-raid.sock 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:35.165 07:53:19 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1667719 ']' 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:35.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:35.165 07:53:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.165 [2024-07-15 07:53:19.824771] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:17:35.165 [2024-07-15 07:53:19.824827] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1667719 ] 00:17:35.165 [2024-07-15 07:53:19.913588] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:35.424 [2024-07-15 07:53:19.982361] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:35.424 [2024-07-15 07:53:20.027745] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:35.424 [2024-07-15 07:53:20.027769] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:35.995 07:53:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:35.995 07:53:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:35.995 07:53:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:35.995 07:53:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:36.254 BaseBdev1_malloc 00:17:36.255 07:53:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:36.515 true 00:17:36.515 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:36.515 [2024-07-15 07:53:21.195404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:36.515 [2024-07-15 07:53:21.195434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:36.515 [2024-07-15 07:53:21.195447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x166ab50 00:17:36.515 [2024-07-15 07:53:21.195453] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:36.515 [2024-07-15 07:53:21.196749] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:36.515 [2024-07-15 07:53:21.196769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:36.515 BaseBdev1 00:17:36.515 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # 
for bdev in "${base_bdevs[@]}" 00:17:36.515 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:36.775 BaseBdev2_malloc 00:17:36.775 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:37.036 true 00:17:37.036 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:37.036 [2024-07-15 07:53:21.766804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:37.036 [2024-07-15 07:53:21.766832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.036 [2024-07-15 07:53:21.766842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x164eea0 00:17:37.036 [2024-07-15 07:53:21.766848] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.036 [2024-07-15 07:53:21.768026] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.036 [2024-07-15 07:53:21.768045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:37.036 BaseBdev2 00:17:37.036 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:37.036 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:37.296 BaseBdev3_malloc 00:17:37.296 07:53:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:37.556 true 00:17:37.556 07:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:37.816 [2024-07-15 07:53:22.334166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:37.816 [2024-07-15 07:53:22.334193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.816 [2024-07-15 07:53:22.334205] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1652fb0 00:17:37.816 [2024-07-15 07:53:22.334211] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.816 [2024-07-15 07:53:22.335396] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.816 [2024-07-15 07:53:22.335414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:37.816 BaseBdev3 00:17:37.816 07:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:37.816 07:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:37.816 BaseBdev4_malloc 00:17:37.816 07:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:38.076 true 00:17:38.076 07:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:38.338 [2024-07-15 07:53:22.889430] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:38.338 [2024-07-15 07:53:22.889457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:38.338 [2024-07-15 07:53:22.889468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1654980 00:17:38.338 [2024-07-15 07:53:22.889474] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:38.338 [2024-07-15 07:53:22.890659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:38.338 [2024-07-15 07:53:22.890678] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:38.338 BaseBdev4 00:17:38.338 07:53:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:38.338 [2024-07-15 07:53:23.081944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:38.338 [2024-07-15 07:53:23.082962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:38.338 [2024-07-15 07:53:23.083013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:38.338 [2024-07-15 07:53:23.083063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:38.338 [2024-07-15 07:53:23.083236] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16544e0 00:17:38.338 [2024-07-15 07:53:23.083244] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:38.338 [2024-07-15 07:53:23.083389] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b6210 00:17:38.338 [2024-07-15 07:53:23.083504] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16544e0 00:17:38.338 [2024-07-15 07:53:23.083509] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16544e0 00:17:38.338 [2024-07-15 07:53:23.083583] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
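The records above show raid_read_error_test building each of its four base devices as a three-layer stack: a malloc bdev, an error bdev wrapped around it (exposed as EE_BaseBdevN_malloc), and a passthru bdev (BaseBdevN) on top, so that I/O failures can later be injected underneath the RAID without touching it directly. A minimal sketch of that sequence, condensed from the rpc.py calls visible in the trace (the RPC shorthand and the loop are illustrative shorthand, not the literal bdev_raid.sh code):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3 4; do
        $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc            # 32 MB backing store, 512-byte blocks
        $RPC bdev_error_create BaseBdev${i}_malloc                       # exposes EE_BaseBdev${i}_malloc for error injection
        $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    # The four passthru bdevs are then assembled into a raid0 volume with a superblock (-s):
    $RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s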
00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.650 "name": "raid_bdev1", 00:17:38.650 "uuid": "4fe876a4-34f5-4ba8-b094-be3ffa6724ba", 00:17:38.650 "strip_size_kb": 64, 00:17:38.650 "state": "online", 00:17:38.650 "raid_level": "raid0", 00:17:38.650 "superblock": true, 00:17:38.650 "num_base_bdevs": 4, 00:17:38.650 "num_base_bdevs_discovered": 4, 00:17:38.650 "num_base_bdevs_operational": 4, 00:17:38.650 "base_bdevs_list": [ 00:17:38.650 { 00:17:38.650 "name": "BaseBdev1", 00:17:38.650 "uuid": "c4a598b5-2e4a-5128-82b2-b2e5e9b38363", 00:17:38.650 "is_configured": true, 00:17:38.650 "data_offset": 2048, 00:17:38.650 "data_size": 63488 00:17:38.650 }, 00:17:38.650 { 00:17:38.650 "name": "BaseBdev2", 00:17:38.650 "uuid": "ee7e3376-ba52-5af9-b663-12f18c10dd79", 00:17:38.650 "is_configured": true, 00:17:38.650 "data_offset": 2048, 00:17:38.650 "data_size": 63488 00:17:38.650 }, 00:17:38.650 { 00:17:38.650 "name": "BaseBdev3", 00:17:38.650 "uuid": "58c00508-f1a0-56c6-a055-2b4f00eee654", 00:17:38.650 "is_configured": true, 00:17:38.650 "data_offset": 2048, 00:17:38.650 "data_size": 63488 00:17:38.650 }, 00:17:38.650 { 00:17:38.650 "name": "BaseBdev4", 00:17:38.650 "uuid": "fc6229fa-f1c5-5e0f-96ac-b071ae5d2c56", 00:17:38.650 "is_configured": true, 00:17:38.650 "data_offset": 2048, 00:17:38.650 "data_size": 63488 00:17:38.650 } 00:17:38.650 ] 00:17:38.650 }' 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.650 07:53:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.271 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:39.271 07:53:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:39.271 [2024-07-15 07:53:23.992436] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b6450 00:17:40.212 07:53:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.472 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:40.733 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.733 "name": "raid_bdev1", 00:17:40.733 "uuid": "4fe876a4-34f5-4ba8-b094-be3ffa6724ba", 00:17:40.733 "strip_size_kb": 64, 00:17:40.733 "state": "online", 00:17:40.733 "raid_level": "raid0", 00:17:40.733 "superblock": true, 00:17:40.733 "num_base_bdevs": 4, 00:17:40.733 "num_base_bdevs_discovered": 4, 00:17:40.733 "num_base_bdevs_operational": 4, 00:17:40.733 "base_bdevs_list": [ 00:17:40.733 { 00:17:40.733 "name": "BaseBdev1", 00:17:40.733 "uuid": "c4a598b5-2e4a-5128-82b2-b2e5e9b38363", 00:17:40.733 "is_configured": true, 00:17:40.733 "data_offset": 2048, 00:17:40.733 "data_size": 63488 00:17:40.733 }, 00:17:40.733 { 00:17:40.733 "name": "BaseBdev2", 00:17:40.733 "uuid": "ee7e3376-ba52-5af9-b663-12f18c10dd79", 00:17:40.733 "is_configured": true, 00:17:40.733 "data_offset": 2048, 00:17:40.733 "data_size": 63488 00:17:40.733 }, 00:17:40.733 { 00:17:40.733 "name": "BaseBdev3", 00:17:40.733 "uuid": "58c00508-f1a0-56c6-a055-2b4f00eee654", 00:17:40.733 "is_configured": true, 00:17:40.733 "data_offset": 2048, 00:17:40.733 "data_size": 63488 00:17:40.733 }, 00:17:40.733 { 00:17:40.733 "name": "BaseBdev4", 00:17:40.733 "uuid": "fc6229fa-f1c5-5e0f-96ac-b071ae5d2c56", 00:17:40.733 "is_configured": true, 00:17:40.733 "data_offset": 2048, 00:17:40.733 "data_size": 63488 00:17:40.733 } 00:17:40.733 ] 00:17:40.733 }' 00:17:40.733 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.733 07:53:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.304 07:53:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:41.304 [2024-07-15 07:53:26.040660] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:41.304 [2024-07-15 07:53:26.040693] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:41.304 [2024-07-15 07:53:26.043272] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:41.304 [2024-07-15 07:53:26.043300] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:41.304 [2024-07-15 07:53:26.043328] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:41.304 [2024-07-15 07:53:26.043334] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x16544e0 name raid_bdev1, state offline 00:17:41.304 0 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1667719 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1667719 ']' 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1667719 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1667719 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1667719' 00:17:41.565 killing process with pid 1667719 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1667719 00:17:41.565 [2024-07-15 07:53:26.126037] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1667719 00:17:41.565 [2024-07-15 07:53:26.143070] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.gvJfw2SGJX 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:41.565 00:17:41.565 real 0m6.522s 00:17:41.565 user 0m10.546s 00:17:41.565 sys 0m0.926s 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:41.565 07:53:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.565 ************************************ 00:17:41.565 END TEST raid_read_error_test 00:17:41.565 ************************************ 00:17:41.565 07:53:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:41.565 07:53:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:17:41.565 07:53:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:41.565 07:53:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:41.565 07:53:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:41.827 ************************************ 00:17:41.827 START TEST raid_write_error_test 00:17:41.827 ************************************ 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 
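That closes raid_read_error_test; the write variant starting above follows the same flow with a write failure injected instead of a read failure. The core of the check, reconstructed from the commands in the trace (bdevperf_log is the mktemp file under /raidtest created at setup; paths are those of this CI workspace):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    BPERF="/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure     # fail reads on one base bdev
    $BPERF perform_tests                                              # drive the randrw workload through raid_bdev1
    # raid0 has no redundancy, so the injected errors must surface as failed I/O in the bdevperf log:
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != "0.00" ]]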
00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.iuBpiDgJWF 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1668882 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1668882 /var/tmp/spdk-raid.sock 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:41.827 07:53:26 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1668882 ']' 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:41.827 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:41.827 07:53:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.827 [2024-07-15 07:53:26.427060] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:17:41.827 [2024-07-15 07:53:26.427116] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1668882 ] 00:17:41.827 [2024-07-15 07:53:26.515613] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:42.088 [2024-07-15 07:53:26.583648] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:42.088 [2024-07-15 07:53:26.626905] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:42.088 [2024-07-15 07:53:26.626927] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:42.659 07:53:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:42.659 07:53:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:42.659 07:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:42.659 07:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:42.919 BaseBdev1_malloc 00:17:42.919 07:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:42.919 true 00:17:42.919 07:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:43.179 [2024-07-15 07:53:27.809431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:43.179 [2024-07-15 07:53:27.809464] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:43.179 [2024-07-15 07:53:27.809476] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2964b50 00:17:43.179 [2024-07-15 07:53:27.809482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:43.179 [2024-07-15 07:53:27.810839] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:43.179 [2024-07-15 07:53:27.810858] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:43.179 BaseBdev1 00:17:43.179 07:53:27 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:43.179 07:53:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:43.439 BaseBdev2_malloc 00:17:43.439 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:43.439 true 00:17:43.699 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:43.699 [2024-07-15 07:53:28.376922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:43.699 [2024-07-15 07:53:28.376949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:43.699 [2024-07-15 07:53:28.376960] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2948ea0 00:17:43.699 [2024-07-15 07:53:28.376967] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:43.699 [2024-07-15 07:53:28.378183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:43.699 [2024-07-15 07:53:28.378201] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:43.699 BaseBdev2 00:17:43.699 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:43.699 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:43.959 BaseBdev3_malloc 00:17:43.960 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:44.220 true 00:17:44.220 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:44.220 [2024-07-15 07:53:28.944264] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:44.220 [2024-07-15 07:53:28.944292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:44.220 [2024-07-15 07:53:28.944305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x294cfb0 00:17:44.220 [2024-07-15 07:53:28.944311] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:44.220 [2024-07-15 07:53:28.945535] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:44.220 [2024-07-15 07:53:28.945554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:44.220 BaseBdev3 00:17:44.220 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:44.220 07:53:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:44.481 BaseBdev4_malloc 00:17:44.481 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:44.742 true 00:17:44.742 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:45.001 [2024-07-15 07:53:29.523603] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:45.001 [2024-07-15 07:53:29.523631] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.001 [2024-07-15 07:53:29.523643] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x294e980 00:17:45.001 [2024-07-15 07:53:29.523649] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.001 [2024-07-15 07:53:29.524865] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.002 [2024-07-15 07:53:29.524883] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:45.002 BaseBdev4 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:45.002 [2024-07-15 07:53:29.700073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:45.002 [2024-07-15 07:53:29.701071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:45.002 [2024-07-15 07:53:29.701122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:45.002 [2024-07-15 07:53:29.701166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:45.002 [2024-07-15 07:53:29.701345] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x294e4e0 00:17:45.002 [2024-07-15 07:53:29.701352] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:45.002 [2024-07-15 07:53:29.701494] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27b0210 00:17:45.002 [2024-07-15 07:53:29.701608] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x294e4e0 00:17:45.002 [2024-07-15 07:53:29.701613] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x294e4e0 00:17:45.002 [2024-07-15 07:53:29.701686] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.002 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:45.262 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.262 "name": "raid_bdev1", 00:17:45.262 "uuid": "1843a746-ee1f-450b-9a51-1535b6dc92fe", 00:17:45.262 "strip_size_kb": 64, 00:17:45.262 "state": "online", 00:17:45.262 "raid_level": "raid0", 00:17:45.262 "superblock": true, 00:17:45.262 "num_base_bdevs": 4, 00:17:45.262 "num_base_bdevs_discovered": 4, 00:17:45.262 "num_base_bdevs_operational": 4, 00:17:45.262 "base_bdevs_list": [ 00:17:45.262 { 00:17:45.262 "name": "BaseBdev1", 00:17:45.262 "uuid": "9b9f3521-40ec-5197-ac4b-ac83ff76b3d6", 00:17:45.262 "is_configured": true, 00:17:45.262 "data_offset": 2048, 00:17:45.262 "data_size": 63488 00:17:45.262 }, 00:17:45.262 { 00:17:45.262 "name": "BaseBdev2", 00:17:45.262 "uuid": "b4e64641-44c2-5873-bbaf-92e8105b23cf", 00:17:45.262 "is_configured": true, 00:17:45.262 "data_offset": 2048, 00:17:45.262 "data_size": 63488 00:17:45.262 }, 00:17:45.262 { 00:17:45.262 "name": "BaseBdev3", 00:17:45.262 "uuid": "e2cdad18-b68d-5e7d-a884-8adefcfb0e6a", 00:17:45.262 "is_configured": true, 00:17:45.262 "data_offset": 2048, 00:17:45.262 "data_size": 63488 00:17:45.262 }, 00:17:45.262 { 00:17:45.262 "name": "BaseBdev4", 00:17:45.262 "uuid": "370e6dff-1b7e-555a-8c2d-39b2161fbc9e", 00:17:45.262 "is_configured": true, 00:17:45.262 "data_offset": 2048, 00:17:45.262 "data_size": 63488 00:17:45.262 } 00:17:45.262 ] 00:17:45.262 }' 00:17:45.262 07:53:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.262 07:53:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.833 07:53:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:45.833 07:53:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:45.833 [2024-07-15 07:53:30.534392] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27b0450 00:17:46.775 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:47.035 07:53:31 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.035 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:47.296 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.296 "name": "raid_bdev1", 00:17:47.296 "uuid": "1843a746-ee1f-450b-9a51-1535b6dc92fe", 00:17:47.296 "strip_size_kb": 64, 00:17:47.296 "state": "online", 00:17:47.296 "raid_level": "raid0", 00:17:47.296 "superblock": true, 00:17:47.296 "num_base_bdevs": 4, 00:17:47.296 "num_base_bdevs_discovered": 4, 00:17:47.296 "num_base_bdevs_operational": 4, 00:17:47.296 "base_bdevs_list": [ 00:17:47.296 { 00:17:47.296 "name": "BaseBdev1", 00:17:47.296 "uuid": "9b9f3521-40ec-5197-ac4b-ac83ff76b3d6", 00:17:47.296 "is_configured": true, 00:17:47.296 "data_offset": 2048, 00:17:47.296 "data_size": 63488 00:17:47.296 }, 00:17:47.296 { 00:17:47.296 "name": "BaseBdev2", 00:17:47.296 "uuid": "b4e64641-44c2-5873-bbaf-92e8105b23cf", 00:17:47.296 "is_configured": true, 00:17:47.296 "data_offset": 2048, 00:17:47.296 "data_size": 63488 00:17:47.296 }, 00:17:47.296 { 00:17:47.296 "name": "BaseBdev3", 00:17:47.296 "uuid": "e2cdad18-b68d-5e7d-a884-8adefcfb0e6a", 00:17:47.296 "is_configured": true, 00:17:47.296 "data_offset": 2048, 00:17:47.296 "data_size": 63488 00:17:47.296 }, 00:17:47.296 { 00:17:47.296 "name": "BaseBdev4", 00:17:47.296 "uuid": "370e6dff-1b7e-555a-8c2d-39b2161fbc9e", 00:17:47.296 "is_configured": true, 00:17:47.296 "data_offset": 2048, 00:17:47.296 "data_size": 63488 00:17:47.296 } 00:17:47.296 ] 00:17:47.296 }' 00:17:47.296 07:53:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.296 07:53:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:47.868 [2024-07-15 07:53:32.558315] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:47.868 [2024-07-15 07:53:32.558338] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:47.868 [2024-07-15 07:53:32.560921] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:47.868 [2024-07-15 07:53:32.560950] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:47.868 [2024-07-15 07:53:32.560978] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:47.868 [2024-07-15 
07:53:32.560984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x294e4e0 name raid_bdev1, state offline 00:17:47.868 0 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1668882 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1668882 ']' 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1668882 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:47.868 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1668882 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1668882' 00:17:48.129 killing process with pid 1668882 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1668882 00:17:48.129 [2024-07-15 07:53:32.627442] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1668882 00:17:48.129 [2024-07-15 07:53:32.644563] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.iuBpiDgJWF 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:48.129 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:48.130 07:53:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:17:48.130 00:17:48.130 real 0m6.421s 00:17:48.130 user 0m10.379s 00:17:48.130 sys 0m0.857s 00:17:48.130 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:48.130 07:53:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.130 ************************************ 00:17:48.130 END TEST raid_write_error_test 00:17:48.130 ************************************ 00:17:48.130 07:53:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:48.130 07:53:32 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:48.130 07:53:32 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:17:48.130 07:53:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:48.130 07:53:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:48.130 07:53:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:48.130 ************************************ 00:17:48.130 START TEST raid_state_function_test 
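Both error tests and the state-function test that begins above rely on the same verify_raid_bdev_state helper, which queries the RAID module over RPC and compares the reported fields against the expected values. A hedged sketch of that check (the RPC call and jq filter appear in the trace; the individual comparisons summarise what the helper asserts rather than quoting it):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r .state <<<"$info") == online ]]
    [[ $(jq -r .raid_level <<<"$info") == raid0 ]]
    [[ $(jq -r .strip_size_kb <<<"$info") -eq 64 ]]
    [[ $(jq -r .num_base_bdevs_discovered <<<"$info") -eq 4 ]]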
00:17:48.130 ************************************ 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1670166 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 
1670166' 00:17:48.130 Process raid pid: 1670166 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1670166 /var/tmp/spdk-raid.sock 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1670166 ']' 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:48.130 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:48.130 07:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.391 [2024-07-15 07:53:32.915568] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:17:48.391 [2024-07-15 07:53:32.915623] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:48.391 [2024-07-15 07:53:33.005035] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:48.391 [2024-07-15 07:53:33.072590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:48.391 [2024-07-15 07:53:33.111529] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:48.391 [2024-07-15 07:53:33.111551] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:49.330 07:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:49.330 07:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:49.330 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:49.330 [2024-07-15 07:53:33.910510] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:49.330 [2024-07-15 07:53:33.910535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:49.331 [2024-07-15 07:53:33.910541] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:49.331 [2024-07-15 07:53:33.910547] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:49.331 [2024-07-15 07:53:33.910552] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:49.331 [2024-07-15 07:53:33.910557] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:49.331 [2024-07-15 07:53:33.910562] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:49.331 [2024-07-15 07:53:33.910567] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.331 07:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:49.591 07:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.591 "name": "Existed_Raid", 00:17:49.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.591 "strip_size_kb": 64, 00:17:49.591 "state": "configuring", 00:17:49.591 "raid_level": "concat", 00:17:49.591 "superblock": false, 00:17:49.591 "num_base_bdevs": 4, 00:17:49.591 "num_base_bdevs_discovered": 0, 00:17:49.591 "num_base_bdevs_operational": 4, 00:17:49.591 "base_bdevs_list": [ 00:17:49.591 { 00:17:49.591 "name": "BaseBdev1", 00:17:49.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.591 "is_configured": false, 00:17:49.591 "data_offset": 0, 00:17:49.591 "data_size": 0 00:17:49.591 }, 00:17:49.591 { 00:17:49.591 "name": "BaseBdev2", 00:17:49.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.591 "is_configured": false, 00:17:49.591 "data_offset": 0, 00:17:49.591 "data_size": 0 00:17:49.591 }, 00:17:49.591 { 00:17:49.591 "name": "BaseBdev3", 00:17:49.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.591 "is_configured": false, 00:17:49.591 "data_offset": 0, 00:17:49.591 "data_size": 0 00:17:49.591 }, 00:17:49.591 { 00:17:49.591 "name": "BaseBdev4", 00:17:49.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.591 "is_configured": false, 00:17:49.591 "data_offset": 0, 00:17:49.591 "data_size": 0 00:17:49.591 } 00:17:49.591 ] 00:17:49.591 }' 00:17:49.591 07:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.591 07:53:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.853 07:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:50.113 [2024-07-15 07:53:34.768569] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:50.113 
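The configuring-state exercise above works by creating the concat array before any of its base bdevs exist; the RAID then stays in state "configuring" with num_base_bdevs_discovered 0 until real bdevs appear. A short sketch of that step, using the commands from the trace:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state, .num_base_bdevs_discovered'
    # prints "configuring" and 0; after bdev_malloc_create 32 512 -b BaseBdev1 below, discovered becomes 1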
[2024-07-15 07:53:34.768586] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27936f0 name Existed_Raid, state configuring 00:17:50.113 07:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:50.375 [2024-07-15 07:53:34.957071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:50.375 [2024-07-15 07:53:34.957088] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:50.375 [2024-07-15 07:53:34.957093] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:50.375 [2024-07-15 07:53:34.957099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:50.375 [2024-07-15 07:53:34.957103] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:50.375 [2024-07-15 07:53:34.957109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:50.375 [2024-07-15 07:53:34.957113] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:50.375 [2024-07-15 07:53:34.957118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:50.375 07:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:50.636 [2024-07-15 07:53:35.156295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:50.636 BaseBdev1 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:50.636 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:50.896 [ 00:17:50.896 { 00:17:50.896 "name": "BaseBdev1", 00:17:50.896 "aliases": [ 00:17:50.896 "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875" 00:17:50.896 ], 00:17:50.896 "product_name": "Malloc disk", 00:17:50.896 "block_size": 512, 00:17:50.896 "num_blocks": 65536, 00:17:50.896 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:50.896 "assigned_rate_limits": { 00:17:50.896 "rw_ios_per_sec": 0, 00:17:50.896 "rw_mbytes_per_sec": 0, 00:17:50.896 "r_mbytes_per_sec": 0, 00:17:50.896 "w_mbytes_per_sec": 0 00:17:50.896 }, 00:17:50.896 "claimed": true, 00:17:50.896 "claim_type": "exclusive_write", 00:17:50.896 "zoned": false, 
00:17:50.896 "supported_io_types": { 00:17:50.896 "read": true, 00:17:50.896 "write": true, 00:17:50.896 "unmap": true, 00:17:50.897 "flush": true, 00:17:50.897 "reset": true, 00:17:50.897 "nvme_admin": false, 00:17:50.897 "nvme_io": false, 00:17:50.897 "nvme_io_md": false, 00:17:50.897 "write_zeroes": true, 00:17:50.897 "zcopy": true, 00:17:50.897 "get_zone_info": false, 00:17:50.897 "zone_management": false, 00:17:50.897 "zone_append": false, 00:17:50.897 "compare": false, 00:17:50.897 "compare_and_write": false, 00:17:50.897 "abort": true, 00:17:50.897 "seek_hole": false, 00:17:50.897 "seek_data": false, 00:17:50.897 "copy": true, 00:17:50.897 "nvme_iov_md": false 00:17:50.897 }, 00:17:50.897 "memory_domains": [ 00:17:50.897 { 00:17:50.897 "dma_device_id": "system", 00:17:50.897 "dma_device_type": 1 00:17:50.897 }, 00:17:50.897 { 00:17:50.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.897 "dma_device_type": 2 00:17:50.897 } 00:17:50.897 ], 00:17:50.897 "driver_specific": {} 00:17:50.897 } 00:17:50.897 ] 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.897 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.157 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.157 "name": "Existed_Raid", 00:17:51.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.157 "strip_size_kb": 64, 00:17:51.157 "state": "configuring", 00:17:51.157 "raid_level": "concat", 00:17:51.157 "superblock": false, 00:17:51.157 "num_base_bdevs": 4, 00:17:51.157 "num_base_bdevs_discovered": 1, 00:17:51.157 "num_base_bdevs_operational": 4, 00:17:51.157 "base_bdevs_list": [ 00:17:51.157 { 00:17:51.157 "name": "BaseBdev1", 00:17:51.157 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:51.157 "is_configured": true, 00:17:51.157 "data_offset": 0, 00:17:51.157 "data_size": 65536 00:17:51.157 }, 00:17:51.157 { 00:17:51.157 "name": "BaseBdev2", 00:17:51.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.157 "is_configured": false, 00:17:51.157 "data_offset": 0, 00:17:51.157 
"data_size": 0 00:17:51.157 }, 00:17:51.157 { 00:17:51.157 "name": "BaseBdev3", 00:17:51.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.157 "is_configured": false, 00:17:51.157 "data_offset": 0, 00:17:51.157 "data_size": 0 00:17:51.157 }, 00:17:51.157 { 00:17:51.157 "name": "BaseBdev4", 00:17:51.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.157 "is_configured": false, 00:17:51.157 "data_offset": 0, 00:17:51.157 "data_size": 0 00:17:51.157 } 00:17:51.157 ] 00:17:51.157 }' 00:17:51.157 07:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.157 07:53:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.728 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:51.728 [2024-07-15 07:53:36.459580] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:51.728 [2024-07-15 07:53:36.459606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2792f60 name Existed_Raid, state configuring 00:17:51.728 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:51.988 [2024-07-15 07:53:36.648087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:51.988 [2024-07-15 07:53:36.649190] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:51.988 [2024-07-15 07:53:36.649213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:51.988 [2024-07-15 07:53:36.649219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:51.988 [2024-07-15 07:53:36.649225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:51.988 [2024-07-15 07:53:36.649229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:51.989 [2024-07-15 07:53:36.649235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.989 07:53:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.989 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.249 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.249 "name": "Existed_Raid", 00:17:52.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.249 "strip_size_kb": 64, 00:17:52.249 "state": "configuring", 00:17:52.249 "raid_level": "concat", 00:17:52.249 "superblock": false, 00:17:52.249 "num_base_bdevs": 4, 00:17:52.249 "num_base_bdevs_discovered": 1, 00:17:52.249 "num_base_bdevs_operational": 4, 00:17:52.249 "base_bdevs_list": [ 00:17:52.249 { 00:17:52.249 "name": "BaseBdev1", 00:17:52.249 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:52.249 "is_configured": true, 00:17:52.249 "data_offset": 0, 00:17:52.249 "data_size": 65536 00:17:52.249 }, 00:17:52.249 { 00:17:52.249 "name": "BaseBdev2", 00:17:52.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.249 "is_configured": false, 00:17:52.249 "data_offset": 0, 00:17:52.249 "data_size": 0 00:17:52.249 }, 00:17:52.249 { 00:17:52.249 "name": "BaseBdev3", 00:17:52.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.249 "is_configured": false, 00:17:52.249 "data_offset": 0, 00:17:52.249 "data_size": 0 00:17:52.249 }, 00:17:52.249 { 00:17:52.249 "name": "BaseBdev4", 00:17:52.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.249 "is_configured": false, 00:17:52.249 "data_offset": 0, 00:17:52.249 "data_size": 0 00:17:52.249 } 00:17:52.249 ] 00:17:52.249 }' 00:17:52.249 07:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.249 07:53:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.819 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:52.819 [2024-07-15 07:53:37.571430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:52.819 BaseBdev2 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:53.079 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:53.372 [ 00:17:53.372 { 00:17:53.372 "name": "BaseBdev2", 00:17:53.372 "aliases": [ 00:17:53.372 "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b" 00:17:53.372 ], 00:17:53.372 "product_name": "Malloc disk", 00:17:53.372 "block_size": 512, 00:17:53.372 "num_blocks": 65536, 00:17:53.372 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:53.372 "assigned_rate_limits": { 00:17:53.372 "rw_ios_per_sec": 0, 00:17:53.372 "rw_mbytes_per_sec": 0, 00:17:53.372 "r_mbytes_per_sec": 0, 00:17:53.372 "w_mbytes_per_sec": 0 00:17:53.372 }, 00:17:53.372 "claimed": true, 00:17:53.372 "claim_type": "exclusive_write", 00:17:53.372 "zoned": false, 00:17:53.372 "supported_io_types": { 00:17:53.372 "read": true, 00:17:53.372 "write": true, 00:17:53.372 "unmap": true, 00:17:53.372 "flush": true, 00:17:53.372 "reset": true, 00:17:53.372 "nvme_admin": false, 00:17:53.372 "nvme_io": false, 00:17:53.372 "nvme_io_md": false, 00:17:53.372 "write_zeroes": true, 00:17:53.372 "zcopy": true, 00:17:53.372 "get_zone_info": false, 00:17:53.372 "zone_management": false, 00:17:53.372 "zone_append": false, 00:17:53.372 "compare": false, 00:17:53.372 "compare_and_write": false, 00:17:53.372 "abort": true, 00:17:53.372 "seek_hole": false, 00:17:53.372 "seek_data": false, 00:17:53.372 "copy": true, 00:17:53.372 "nvme_iov_md": false 00:17:53.372 }, 00:17:53.372 "memory_domains": [ 00:17:53.372 { 00:17:53.372 "dma_device_id": "system", 00:17:53.372 "dma_device_type": 1 00:17:53.372 }, 00:17:53.372 { 00:17:53.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.372 "dma_device_type": 2 00:17:53.372 } 00:17:53.372 ], 00:17:53.372 "driver_specific": {} 00:17:53.372 } 00:17:53.372 ] 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.372 07:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.632 07:53:38 
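After every base bdev is added, the test re-runs verify_raid_bdev_state: it dumps the raid with bdev_raid_get_bdevs all, selects Existed_Raid with jq, and compares state, raid_level, strip_size_kb and the discovered/operational base bdev counts against the expected values. The same assertion can be collapsed into one jq -e expression; a sketch under the same assumptions as above (verify_state is a hypothetical name, $rpc the shorthand defined earlier):

  # Sketch: assert Existed_Raid's state and discovered base bdev count in one jq call.
  verify_state() {    # usage: verify_state <expected_state> <expected_discovered>
      $rpc bdev_raid_get_bdevs all | jq -e --arg st "$1" --argjson disc "$2" '
          .[] | select(.name == "Existed_Raid")
              | (.state == $st)
                and (.raid_level == "concat")
                and (.strip_size_kb == 64)
                and (.num_base_bdevs_discovered == $disc)
                and (.num_base_bdevs_operational == 4)' > /dev/null
  }
  verify_state configuring 2   # two of the four base bdevs exist at this point in the trace

Once BaseBdev4 is created further down, the same check passes as verify_state online 4: claiming the last base bdev is what registers the raid io device and flips the state from "configuring" to "online".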
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.632 "name": "Existed_Raid", 00:17:53.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.632 "strip_size_kb": 64, 00:17:53.632 "state": "configuring", 00:17:53.632 "raid_level": "concat", 00:17:53.632 "superblock": false, 00:17:53.632 "num_base_bdevs": 4, 00:17:53.632 "num_base_bdevs_discovered": 2, 00:17:53.632 "num_base_bdevs_operational": 4, 00:17:53.632 "base_bdevs_list": [ 00:17:53.632 { 00:17:53.632 "name": "BaseBdev1", 00:17:53.632 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:53.632 "is_configured": true, 00:17:53.632 "data_offset": 0, 00:17:53.632 "data_size": 65536 00:17:53.632 }, 00:17:53.632 { 00:17:53.632 "name": "BaseBdev2", 00:17:53.632 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:53.632 "is_configured": true, 00:17:53.632 "data_offset": 0, 00:17:53.632 "data_size": 65536 00:17:53.632 }, 00:17:53.632 { 00:17:53.632 "name": "BaseBdev3", 00:17:53.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.632 "is_configured": false, 00:17:53.632 "data_offset": 0, 00:17:53.632 "data_size": 0 00:17:53.632 }, 00:17:53.632 { 00:17:53.632 "name": "BaseBdev4", 00:17:53.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.633 "is_configured": false, 00:17:53.633 "data_offset": 0, 00:17:53.633 "data_size": 0 00:17:53.633 } 00:17:53.633 ] 00:17:53.633 }' 00:17:53.633 07:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.633 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:54.203 [2024-07-15 07:53:38.931832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:54.203 BaseBdev3 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:54.203 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:54.204 07:53:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:54.463 07:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:54.724 [ 00:17:54.724 { 00:17:54.724 "name": "BaseBdev3", 00:17:54.724 "aliases": [ 00:17:54.724 "b5151743-4cf3-463f-8f74-576c1e0bbd39" 00:17:54.724 ], 00:17:54.724 "product_name": "Malloc disk", 00:17:54.724 "block_size": 512, 00:17:54.724 "num_blocks": 65536, 00:17:54.724 "uuid": "b5151743-4cf3-463f-8f74-576c1e0bbd39", 00:17:54.724 "assigned_rate_limits": { 00:17:54.724 "rw_ios_per_sec": 0, 00:17:54.724 "rw_mbytes_per_sec": 0, 00:17:54.724 "r_mbytes_per_sec": 0, 
00:17:54.724 "w_mbytes_per_sec": 0 00:17:54.724 }, 00:17:54.724 "claimed": true, 00:17:54.724 "claim_type": "exclusive_write", 00:17:54.724 "zoned": false, 00:17:54.724 "supported_io_types": { 00:17:54.724 "read": true, 00:17:54.724 "write": true, 00:17:54.724 "unmap": true, 00:17:54.724 "flush": true, 00:17:54.724 "reset": true, 00:17:54.724 "nvme_admin": false, 00:17:54.724 "nvme_io": false, 00:17:54.724 "nvme_io_md": false, 00:17:54.724 "write_zeroes": true, 00:17:54.724 "zcopy": true, 00:17:54.724 "get_zone_info": false, 00:17:54.724 "zone_management": false, 00:17:54.724 "zone_append": false, 00:17:54.724 "compare": false, 00:17:54.724 "compare_and_write": false, 00:17:54.724 "abort": true, 00:17:54.724 "seek_hole": false, 00:17:54.724 "seek_data": false, 00:17:54.724 "copy": true, 00:17:54.724 "nvme_iov_md": false 00:17:54.724 }, 00:17:54.724 "memory_domains": [ 00:17:54.724 { 00:17:54.724 "dma_device_id": "system", 00:17:54.724 "dma_device_type": 1 00:17:54.724 }, 00:17:54.724 { 00:17:54.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.724 "dma_device_type": 2 00:17:54.724 } 00:17:54.724 ], 00:17:54.724 "driver_specific": {} 00:17:54.724 } 00:17:54.724 ] 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.724 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.984 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.984 "name": "Existed_Raid", 00:17:54.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.984 "strip_size_kb": 64, 00:17:54.984 "state": "configuring", 00:17:54.984 "raid_level": "concat", 00:17:54.984 "superblock": false, 00:17:54.984 "num_base_bdevs": 4, 00:17:54.984 "num_base_bdevs_discovered": 3, 00:17:54.984 "num_base_bdevs_operational": 4, 00:17:54.984 "base_bdevs_list": [ 00:17:54.984 { 00:17:54.984 "name": "BaseBdev1", 
00:17:54.984 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:54.984 "is_configured": true, 00:17:54.984 "data_offset": 0, 00:17:54.984 "data_size": 65536 00:17:54.984 }, 00:17:54.984 { 00:17:54.984 "name": "BaseBdev2", 00:17:54.984 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:54.984 "is_configured": true, 00:17:54.984 "data_offset": 0, 00:17:54.984 "data_size": 65536 00:17:54.984 }, 00:17:54.984 { 00:17:54.984 "name": "BaseBdev3", 00:17:54.984 "uuid": "b5151743-4cf3-463f-8f74-576c1e0bbd39", 00:17:54.984 "is_configured": true, 00:17:54.984 "data_offset": 0, 00:17:54.984 "data_size": 65536 00:17:54.984 }, 00:17:54.984 { 00:17:54.984 "name": "BaseBdev4", 00:17:54.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.984 "is_configured": false, 00:17:54.984 "data_offset": 0, 00:17:54.984 "data_size": 0 00:17:54.984 } 00:17:54.984 ] 00:17:54.984 }' 00:17:54.984 07:53:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.984 07:53:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:55.554 [2024-07-15 07:53:40.216369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:55.554 [2024-07-15 07:53:40.216396] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2793fc0 00:17:55.554 [2024-07-15 07:53:40.216400] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:55.554 [2024-07-15 07:53:40.216561] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2793c00 00:17:55.554 [2024-07-15 07:53:40.216652] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2793fc0 00:17:55.554 [2024-07-15 07:53:40.216658] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2793fc0 00:17:55.554 [2024-07-15 07:53:40.216786] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:55.554 BaseBdev4 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:55.554 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.813 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:56.073 [ 00:17:56.073 { 00:17:56.073 "name": "BaseBdev4", 00:17:56.073 "aliases": [ 00:17:56.073 "536e3c22-74b2-44cd-bda0-cd6cf856bd0d" 00:17:56.073 ], 00:17:56.073 "product_name": "Malloc disk", 00:17:56.073 "block_size": 512, 00:17:56.073 
"num_blocks": 65536, 00:17:56.073 "uuid": "536e3c22-74b2-44cd-bda0-cd6cf856bd0d", 00:17:56.073 "assigned_rate_limits": { 00:17:56.073 "rw_ios_per_sec": 0, 00:17:56.073 "rw_mbytes_per_sec": 0, 00:17:56.073 "r_mbytes_per_sec": 0, 00:17:56.073 "w_mbytes_per_sec": 0 00:17:56.073 }, 00:17:56.073 "claimed": true, 00:17:56.073 "claim_type": "exclusive_write", 00:17:56.073 "zoned": false, 00:17:56.073 "supported_io_types": { 00:17:56.073 "read": true, 00:17:56.073 "write": true, 00:17:56.073 "unmap": true, 00:17:56.073 "flush": true, 00:17:56.073 "reset": true, 00:17:56.073 "nvme_admin": false, 00:17:56.073 "nvme_io": false, 00:17:56.073 "nvme_io_md": false, 00:17:56.073 "write_zeroes": true, 00:17:56.073 "zcopy": true, 00:17:56.073 "get_zone_info": false, 00:17:56.073 "zone_management": false, 00:17:56.073 "zone_append": false, 00:17:56.073 "compare": false, 00:17:56.073 "compare_and_write": false, 00:17:56.073 "abort": true, 00:17:56.073 "seek_hole": false, 00:17:56.073 "seek_data": false, 00:17:56.073 "copy": true, 00:17:56.073 "nvme_iov_md": false 00:17:56.073 }, 00:17:56.073 "memory_domains": [ 00:17:56.073 { 00:17:56.073 "dma_device_id": "system", 00:17:56.073 "dma_device_type": 1 00:17:56.073 }, 00:17:56.073 { 00:17:56.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.073 "dma_device_type": 2 00:17:56.073 } 00:17:56.073 ], 00:17:56.073 "driver_specific": {} 00:17:56.073 } 00:17:56.073 ] 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.073 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.073 "name": "Existed_Raid", 00:17:56.073 "uuid": "d5f3e181-43cb-482a-874a-9b131c1a8aae", 00:17:56.073 "strip_size_kb": 64, 00:17:56.073 "state": "online", 00:17:56.073 "raid_level": "concat", 00:17:56.073 "superblock": false, 
00:17:56.073 "num_base_bdevs": 4, 00:17:56.073 "num_base_bdevs_discovered": 4, 00:17:56.073 "num_base_bdevs_operational": 4, 00:17:56.073 "base_bdevs_list": [ 00:17:56.073 { 00:17:56.073 "name": "BaseBdev1", 00:17:56.073 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:56.073 "is_configured": true, 00:17:56.074 "data_offset": 0, 00:17:56.074 "data_size": 65536 00:17:56.074 }, 00:17:56.074 { 00:17:56.074 "name": "BaseBdev2", 00:17:56.074 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:56.074 "is_configured": true, 00:17:56.074 "data_offset": 0, 00:17:56.074 "data_size": 65536 00:17:56.074 }, 00:17:56.074 { 00:17:56.074 "name": "BaseBdev3", 00:17:56.074 "uuid": "b5151743-4cf3-463f-8f74-576c1e0bbd39", 00:17:56.074 "is_configured": true, 00:17:56.074 "data_offset": 0, 00:17:56.074 "data_size": 65536 00:17:56.074 }, 00:17:56.074 { 00:17:56.074 "name": "BaseBdev4", 00:17:56.074 "uuid": "536e3c22-74b2-44cd-bda0-cd6cf856bd0d", 00:17:56.074 "is_configured": true, 00:17:56.074 "data_offset": 0, 00:17:56.074 "data_size": 65536 00:17:56.074 } 00:17:56.074 ] 00:17:56.074 }' 00:17:56.074 07:53:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.074 07:53:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:56.642 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:56.902 [2024-07-15 07:53:41.503899] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:56.902 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:56.902 "name": "Existed_Raid", 00:17:56.902 "aliases": [ 00:17:56.902 "d5f3e181-43cb-482a-874a-9b131c1a8aae" 00:17:56.902 ], 00:17:56.902 "product_name": "Raid Volume", 00:17:56.902 "block_size": 512, 00:17:56.902 "num_blocks": 262144, 00:17:56.902 "uuid": "d5f3e181-43cb-482a-874a-9b131c1a8aae", 00:17:56.902 "assigned_rate_limits": { 00:17:56.902 "rw_ios_per_sec": 0, 00:17:56.902 "rw_mbytes_per_sec": 0, 00:17:56.902 "r_mbytes_per_sec": 0, 00:17:56.902 "w_mbytes_per_sec": 0 00:17:56.902 }, 00:17:56.902 "claimed": false, 00:17:56.902 "zoned": false, 00:17:56.902 "supported_io_types": { 00:17:56.902 "read": true, 00:17:56.902 "write": true, 00:17:56.902 "unmap": true, 00:17:56.902 "flush": true, 00:17:56.902 "reset": true, 00:17:56.902 "nvme_admin": false, 00:17:56.902 "nvme_io": false, 00:17:56.902 "nvme_io_md": false, 00:17:56.902 "write_zeroes": true, 00:17:56.902 "zcopy": false, 00:17:56.902 "get_zone_info": false, 00:17:56.902 "zone_management": false, 00:17:56.902 "zone_append": false, 00:17:56.902 "compare": false, 00:17:56.902 
"compare_and_write": false, 00:17:56.902 "abort": false, 00:17:56.902 "seek_hole": false, 00:17:56.902 "seek_data": false, 00:17:56.902 "copy": false, 00:17:56.902 "nvme_iov_md": false 00:17:56.902 }, 00:17:56.902 "memory_domains": [ 00:17:56.902 { 00:17:56.902 "dma_device_id": "system", 00:17:56.902 "dma_device_type": 1 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.902 "dma_device_type": 2 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "system", 00:17:56.902 "dma_device_type": 1 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.902 "dma_device_type": 2 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "system", 00:17:56.902 "dma_device_type": 1 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.902 "dma_device_type": 2 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "system", 00:17:56.902 "dma_device_type": 1 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.902 "dma_device_type": 2 00:17:56.902 } 00:17:56.902 ], 00:17:56.902 "driver_specific": { 00:17:56.902 "raid": { 00:17:56.902 "uuid": "d5f3e181-43cb-482a-874a-9b131c1a8aae", 00:17:56.902 "strip_size_kb": 64, 00:17:56.902 "state": "online", 00:17:56.902 "raid_level": "concat", 00:17:56.902 "superblock": false, 00:17:56.902 "num_base_bdevs": 4, 00:17:56.902 "num_base_bdevs_discovered": 4, 00:17:56.902 "num_base_bdevs_operational": 4, 00:17:56.902 "base_bdevs_list": [ 00:17:56.902 { 00:17:56.902 "name": "BaseBdev1", 00:17:56.902 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:56.902 "is_configured": true, 00:17:56.902 "data_offset": 0, 00:17:56.902 "data_size": 65536 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "name": "BaseBdev2", 00:17:56.902 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:56.902 "is_configured": true, 00:17:56.902 "data_offset": 0, 00:17:56.902 "data_size": 65536 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "name": "BaseBdev3", 00:17:56.902 "uuid": "b5151743-4cf3-463f-8f74-576c1e0bbd39", 00:17:56.902 "is_configured": true, 00:17:56.902 "data_offset": 0, 00:17:56.902 "data_size": 65536 00:17:56.902 }, 00:17:56.902 { 00:17:56.902 "name": "BaseBdev4", 00:17:56.902 "uuid": "536e3c22-74b2-44cd-bda0-cd6cf856bd0d", 00:17:56.902 "is_configured": true, 00:17:56.902 "data_offset": 0, 00:17:56.902 "data_size": 65536 00:17:56.902 } 00:17:56.902 ] 00:17:56.903 } 00:17:56.903 } 00:17:56.903 }' 00:17:56.903 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:56.903 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:56.903 BaseBdev2 00:17:56.903 BaseBdev3 00:17:56.903 BaseBdev4' 00:17:56.903 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:56.903 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:56.903 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.162 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.162 "name": "BaseBdev1", 00:17:57.162 "aliases": [ 00:17:57.162 "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875" 00:17:57.162 ], 00:17:57.162 
"product_name": "Malloc disk", 00:17:57.162 "block_size": 512, 00:17:57.162 "num_blocks": 65536, 00:17:57.162 "uuid": "6b2cac85-5ead-48d6-9b2c-a2b0e8a50875", 00:17:57.162 "assigned_rate_limits": { 00:17:57.162 "rw_ios_per_sec": 0, 00:17:57.162 "rw_mbytes_per_sec": 0, 00:17:57.162 "r_mbytes_per_sec": 0, 00:17:57.162 "w_mbytes_per_sec": 0 00:17:57.162 }, 00:17:57.162 "claimed": true, 00:17:57.162 "claim_type": "exclusive_write", 00:17:57.162 "zoned": false, 00:17:57.162 "supported_io_types": { 00:17:57.162 "read": true, 00:17:57.162 "write": true, 00:17:57.162 "unmap": true, 00:17:57.162 "flush": true, 00:17:57.162 "reset": true, 00:17:57.162 "nvme_admin": false, 00:17:57.162 "nvme_io": false, 00:17:57.162 "nvme_io_md": false, 00:17:57.162 "write_zeroes": true, 00:17:57.162 "zcopy": true, 00:17:57.162 "get_zone_info": false, 00:17:57.162 "zone_management": false, 00:17:57.162 "zone_append": false, 00:17:57.162 "compare": false, 00:17:57.162 "compare_and_write": false, 00:17:57.162 "abort": true, 00:17:57.162 "seek_hole": false, 00:17:57.162 "seek_data": false, 00:17:57.162 "copy": true, 00:17:57.162 "nvme_iov_md": false 00:17:57.162 }, 00:17:57.162 "memory_domains": [ 00:17:57.162 { 00:17:57.162 "dma_device_id": "system", 00:17:57.162 "dma_device_type": 1 00:17:57.162 }, 00:17:57.162 { 00:17:57.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.162 "dma_device_type": 2 00:17:57.162 } 00:17:57.162 ], 00:17:57.162 "driver_specific": {} 00:17:57.162 }' 00:17:57.162 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.162 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.162 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:57.162 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.162 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.422 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.422 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.422 07:53:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:57.422 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.681 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.681 "name": "BaseBdev2", 00:17:57.681 "aliases": [ 00:17:57.681 "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b" 00:17:57.681 ], 00:17:57.681 "product_name": "Malloc disk", 00:17:57.681 "block_size": 512, 00:17:57.681 "num_blocks": 65536, 00:17:57.681 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:57.681 
"assigned_rate_limits": { 00:17:57.681 "rw_ios_per_sec": 0, 00:17:57.681 "rw_mbytes_per_sec": 0, 00:17:57.681 "r_mbytes_per_sec": 0, 00:17:57.681 "w_mbytes_per_sec": 0 00:17:57.681 }, 00:17:57.681 "claimed": true, 00:17:57.681 "claim_type": "exclusive_write", 00:17:57.681 "zoned": false, 00:17:57.681 "supported_io_types": { 00:17:57.681 "read": true, 00:17:57.681 "write": true, 00:17:57.681 "unmap": true, 00:17:57.681 "flush": true, 00:17:57.681 "reset": true, 00:17:57.681 "nvme_admin": false, 00:17:57.681 "nvme_io": false, 00:17:57.681 "nvme_io_md": false, 00:17:57.681 "write_zeroes": true, 00:17:57.681 "zcopy": true, 00:17:57.681 "get_zone_info": false, 00:17:57.681 "zone_management": false, 00:17:57.681 "zone_append": false, 00:17:57.681 "compare": false, 00:17:57.681 "compare_and_write": false, 00:17:57.681 "abort": true, 00:17:57.681 "seek_hole": false, 00:17:57.681 "seek_data": false, 00:17:57.681 "copy": true, 00:17:57.681 "nvme_iov_md": false 00:17:57.681 }, 00:17:57.681 "memory_domains": [ 00:17:57.681 { 00:17:57.681 "dma_device_id": "system", 00:17:57.681 "dma_device_type": 1 00:17:57.681 }, 00:17:57.681 { 00:17:57.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.681 "dma_device_type": 2 00:17:57.681 } 00:17:57.681 ], 00:17:57.681 "driver_specific": {} 00:17:57.681 }' 00:17:57.681 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.681 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.681 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:57.681 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:57.941 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:58.200 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.200 "name": "BaseBdev3", 00:17:58.200 "aliases": [ 00:17:58.200 "b5151743-4cf3-463f-8f74-576c1e0bbd39" 00:17:58.200 ], 00:17:58.200 "product_name": "Malloc disk", 00:17:58.200 "block_size": 512, 00:17:58.200 "num_blocks": 65536, 00:17:58.200 "uuid": "b5151743-4cf3-463f-8f74-576c1e0bbd39", 00:17:58.200 "assigned_rate_limits": { 00:17:58.200 "rw_ios_per_sec": 0, 00:17:58.200 "rw_mbytes_per_sec": 0, 00:17:58.200 "r_mbytes_per_sec": 0, 00:17:58.200 "w_mbytes_per_sec": 0 00:17:58.200 
}, 00:17:58.200 "claimed": true, 00:17:58.200 "claim_type": "exclusive_write", 00:17:58.200 "zoned": false, 00:17:58.200 "supported_io_types": { 00:17:58.200 "read": true, 00:17:58.200 "write": true, 00:17:58.200 "unmap": true, 00:17:58.200 "flush": true, 00:17:58.200 "reset": true, 00:17:58.200 "nvme_admin": false, 00:17:58.200 "nvme_io": false, 00:17:58.200 "nvme_io_md": false, 00:17:58.200 "write_zeroes": true, 00:17:58.200 "zcopy": true, 00:17:58.200 "get_zone_info": false, 00:17:58.200 "zone_management": false, 00:17:58.200 "zone_append": false, 00:17:58.200 "compare": false, 00:17:58.200 "compare_and_write": false, 00:17:58.200 "abort": true, 00:17:58.200 "seek_hole": false, 00:17:58.200 "seek_data": false, 00:17:58.200 "copy": true, 00:17:58.200 "nvme_iov_md": false 00:17:58.200 }, 00:17:58.200 "memory_domains": [ 00:17:58.200 { 00:17:58.200 "dma_device_id": "system", 00:17:58.200 "dma_device_type": 1 00:17:58.200 }, 00:17:58.200 { 00:17:58.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.200 "dma_device_type": 2 00:17:58.200 } 00:17:58.200 ], 00:17:58.200 "driver_specific": {} 00:17:58.200 }' 00:17:58.200 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.200 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.200 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.200 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.459 07:53:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.459 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.718 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:58.718 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:58.718 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.718 "name": "BaseBdev4", 00:17:58.718 "aliases": [ 00:17:58.718 "536e3c22-74b2-44cd-bda0-cd6cf856bd0d" 00:17:58.718 ], 00:17:58.718 "product_name": "Malloc disk", 00:17:58.718 "block_size": 512, 00:17:58.718 "num_blocks": 65536, 00:17:58.718 "uuid": "536e3c22-74b2-44cd-bda0-cd6cf856bd0d", 00:17:58.718 "assigned_rate_limits": { 00:17:58.718 "rw_ios_per_sec": 0, 00:17:58.718 "rw_mbytes_per_sec": 0, 00:17:58.718 "r_mbytes_per_sec": 0, 00:17:58.718 "w_mbytes_per_sec": 0 00:17:58.718 }, 00:17:58.718 "claimed": true, 00:17:58.718 "claim_type": "exclusive_write", 00:17:58.718 "zoned": false, 00:17:58.718 "supported_io_types": { 00:17:58.718 "read": true, 
00:17:58.718 "write": true, 00:17:58.718 "unmap": true, 00:17:58.718 "flush": true, 00:17:58.718 "reset": true, 00:17:58.718 "nvme_admin": false, 00:17:58.718 "nvme_io": false, 00:17:58.718 "nvme_io_md": false, 00:17:58.718 "write_zeroes": true, 00:17:58.718 "zcopy": true, 00:17:58.718 "get_zone_info": false, 00:17:58.718 "zone_management": false, 00:17:58.718 "zone_append": false, 00:17:58.718 "compare": false, 00:17:58.718 "compare_and_write": false, 00:17:58.718 "abort": true, 00:17:58.718 "seek_hole": false, 00:17:58.718 "seek_data": false, 00:17:58.718 "copy": true, 00:17:58.718 "nvme_iov_md": false 00:17:58.718 }, 00:17:58.718 "memory_domains": [ 00:17:58.718 { 00:17:58.718 "dma_device_id": "system", 00:17:58.718 "dma_device_type": 1 00:17:58.718 }, 00:17:58.718 { 00:17:58.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.718 "dma_device_type": 2 00:17:58.718 } 00:17:58.718 ], 00:17:58.718 "driver_specific": {} 00:17:58.718 }' 00:17:58.718 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.718 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:58.977 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:59.235 [2024-07-15 07:53:43.953861] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:59.235 [2024-07-15 07:53:43.953878] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:59.235 [2024-07-15 07:53:43.953912] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.235 07:53:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.235 07:53:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.494 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.494 "name": "Existed_Raid", 00:17:59.494 "uuid": "d5f3e181-43cb-482a-874a-9b131c1a8aae", 00:17:59.494 "strip_size_kb": 64, 00:17:59.494 "state": "offline", 00:17:59.494 "raid_level": "concat", 00:17:59.494 "superblock": false, 00:17:59.494 "num_base_bdevs": 4, 00:17:59.494 "num_base_bdevs_discovered": 3, 00:17:59.494 "num_base_bdevs_operational": 3, 00:17:59.494 "base_bdevs_list": [ 00:17:59.494 { 00:17:59.494 "name": null, 00:17:59.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.494 "is_configured": false, 00:17:59.494 "data_offset": 0, 00:17:59.494 "data_size": 65536 00:17:59.494 }, 00:17:59.494 { 00:17:59.494 "name": "BaseBdev2", 00:17:59.494 "uuid": "ca487ce4-e8d5-44a6-b337-0bf8d5ea195b", 00:17:59.494 "is_configured": true, 00:17:59.494 "data_offset": 0, 00:17:59.494 "data_size": 65536 00:17:59.494 }, 00:17:59.494 { 00:17:59.494 "name": "BaseBdev3", 00:17:59.494 "uuid": "b5151743-4cf3-463f-8f74-576c1e0bbd39", 00:17:59.494 "is_configured": true, 00:17:59.494 "data_offset": 0, 00:17:59.494 "data_size": 65536 00:17:59.494 }, 00:17:59.494 { 00:17:59.494 "name": "BaseBdev4", 00:17:59.494 "uuid": "536e3c22-74b2-44cd-bda0-cd6cf856bd0d", 00:17:59.494 "is_configured": true, 00:17:59.494 "data_offset": 0, 00:17:59.494 "data_size": 65536 00:17:59.494 } 00:17:59.494 ] 00:17:59.494 }' 00:17:59.494 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.494 07:53:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.062 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:00.062 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:00.062 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.062 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:00.320 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:00.320 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:18:00.320 07:53:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:00.320 [2024-07-15 07:53:45.040610] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:00.320 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:00.321 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:00.321 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.321 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:00.629 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:00.629 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:00.629 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:00.888 [2024-07-15 07:53:45.427378] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:00.888 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:01.147 [2024-07-15 07:53:45.814127] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:01.147 [2024-07-15 07:53:45.814154] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2793fc0 name Existed_Raid, state offline 00:18:01.147 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:01.147 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:01.147 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.147 07:53:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:01.407 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:01.407 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:01.407 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:01.407 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:18:01.407 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:01.407 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:01.666 BaseBdev2 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:01.666 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:01.925 [ 00:18:01.925 { 00:18:01.925 "name": "BaseBdev2", 00:18:01.925 "aliases": [ 00:18:01.925 "52e1c1a1-8a2a-446e-abbc-b98a9027f45c" 00:18:01.925 ], 00:18:01.925 "product_name": "Malloc disk", 00:18:01.925 "block_size": 512, 00:18:01.925 "num_blocks": 65536, 00:18:01.925 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:01.925 "assigned_rate_limits": { 00:18:01.925 "rw_ios_per_sec": 0, 00:18:01.925 "rw_mbytes_per_sec": 0, 00:18:01.925 "r_mbytes_per_sec": 0, 00:18:01.925 "w_mbytes_per_sec": 0 00:18:01.925 }, 00:18:01.925 "claimed": false, 00:18:01.925 "zoned": false, 00:18:01.925 "supported_io_types": { 00:18:01.925 "read": true, 00:18:01.925 "write": true, 00:18:01.925 "unmap": true, 00:18:01.925 "flush": true, 00:18:01.925 "reset": true, 00:18:01.925 "nvme_admin": false, 00:18:01.925 "nvme_io": false, 00:18:01.925 "nvme_io_md": false, 00:18:01.925 "write_zeroes": true, 00:18:01.925 "zcopy": true, 00:18:01.925 "get_zone_info": false, 00:18:01.925 "zone_management": false, 00:18:01.925 "zone_append": false, 00:18:01.925 "compare": false, 00:18:01.925 "compare_and_write": false, 00:18:01.925 "abort": true, 00:18:01.925 "seek_hole": false, 00:18:01.925 "seek_data": false, 00:18:01.925 "copy": true, 00:18:01.925 "nvme_iov_md": false 00:18:01.925 }, 00:18:01.925 "memory_domains": [ 00:18:01.925 { 00:18:01.925 "dma_device_id": "system", 00:18:01.925 "dma_device_type": 1 00:18:01.925 }, 00:18:01.925 { 00:18:01.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.925 "dma_device_type": 2 00:18:01.925 } 00:18:01.925 ], 00:18:01.925 "driver_specific": {} 00:18:01.925 } 00:18:01.925 ] 00:18:01.925 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:01.925 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:01.925 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:01.925 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:02.185 BaseBdev3 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.185 07:53:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:02.445 [ 00:18:02.445 { 00:18:02.445 "name": "BaseBdev3", 00:18:02.445 "aliases": [ 00:18:02.445 "a0431b84-a725-49ef-872c-a0367ff377b6" 00:18:02.445 ], 00:18:02.445 "product_name": "Malloc disk", 00:18:02.445 "block_size": 512, 00:18:02.445 "num_blocks": 65536, 00:18:02.445 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:02.445 "assigned_rate_limits": { 00:18:02.445 "rw_ios_per_sec": 0, 00:18:02.445 "rw_mbytes_per_sec": 0, 00:18:02.445 "r_mbytes_per_sec": 0, 00:18:02.445 "w_mbytes_per_sec": 0 00:18:02.445 }, 00:18:02.445 "claimed": false, 00:18:02.445 "zoned": false, 00:18:02.445 "supported_io_types": { 00:18:02.445 "read": true, 00:18:02.445 "write": true, 00:18:02.445 "unmap": true, 00:18:02.445 "flush": true, 00:18:02.445 "reset": true, 00:18:02.445 "nvme_admin": false, 00:18:02.445 "nvme_io": false, 00:18:02.445 "nvme_io_md": false, 00:18:02.445 "write_zeroes": true, 00:18:02.445 "zcopy": true, 00:18:02.445 "get_zone_info": false, 00:18:02.445 "zone_management": false, 00:18:02.445 "zone_append": false, 00:18:02.445 "compare": false, 00:18:02.445 "compare_and_write": false, 00:18:02.445 "abort": true, 00:18:02.445 "seek_hole": false, 00:18:02.445 "seek_data": false, 00:18:02.445 "copy": true, 00:18:02.445 "nvme_iov_md": false 00:18:02.445 }, 00:18:02.445 "memory_domains": [ 00:18:02.445 { 00:18:02.445 "dma_device_id": "system", 00:18:02.445 "dma_device_type": 1 00:18:02.445 }, 00:18:02.445 { 00:18:02.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.445 "dma_device_type": 2 00:18:02.445 } 00:18:02.445 ], 00:18:02.445 "driver_specific": {} 00:18:02.445 } 00:18:02.445 ] 00:18:02.445 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:02.445 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:02.445 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:02.445 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:02.706 BaseBdev4 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:02.706 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.966 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:02.966 [ 00:18:02.966 { 00:18:02.966 "name": "BaseBdev4", 00:18:02.966 "aliases": [ 00:18:02.966 "99e24c18-ef1d-4d42-87e9-d8a2afc11946" 00:18:02.966 ], 00:18:02.966 "product_name": "Malloc disk", 00:18:02.966 "block_size": 512, 00:18:02.966 "num_blocks": 65536, 00:18:02.966 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:02.966 "assigned_rate_limits": { 00:18:02.966 "rw_ios_per_sec": 0, 00:18:02.966 "rw_mbytes_per_sec": 0, 00:18:02.966 "r_mbytes_per_sec": 0, 00:18:02.966 "w_mbytes_per_sec": 0 00:18:02.966 }, 00:18:02.966 "claimed": false, 00:18:02.966 "zoned": false, 00:18:02.966 "supported_io_types": { 00:18:02.966 "read": true, 00:18:02.966 "write": true, 00:18:02.966 "unmap": true, 00:18:02.966 "flush": true, 00:18:02.966 "reset": true, 00:18:02.966 "nvme_admin": false, 00:18:02.966 "nvme_io": false, 00:18:02.966 "nvme_io_md": false, 00:18:02.966 "write_zeroes": true, 00:18:02.966 "zcopy": true, 00:18:02.966 "get_zone_info": false, 00:18:02.966 "zone_management": false, 00:18:02.966 "zone_append": false, 00:18:02.966 "compare": false, 00:18:02.966 "compare_and_write": false, 00:18:02.966 "abort": true, 00:18:02.966 "seek_hole": false, 00:18:02.966 "seek_data": false, 00:18:02.966 "copy": true, 00:18:02.966 "nvme_iov_md": false 00:18:02.966 }, 00:18:02.966 "memory_domains": [ 00:18:02.966 { 00:18:02.966 "dma_device_id": "system", 00:18:02.966 "dma_device_type": 1 00:18:02.966 }, 00:18:02.966 { 00:18:02.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.966 "dma_device_type": 2 00:18:02.966 } 00:18:02.966 ], 00:18:02.966 "driver_specific": {} 00:18:02.966 } 00:18:02.966 ] 00:18:02.966 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:02.966 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:02.966 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:02.966 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:03.227 [2024-07-15 07:53:47.816855] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:03.227 [2024-07-15 07:53:47.816882] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:03.227 [2024-07-15 07:53:47.816895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:03.227 [2024-07-15 07:53:47.817926] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:03.227 [2024-07-15 07:53:47.817956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.227 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.487 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.487 "name": "Existed_Raid", 00:18:03.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.487 "strip_size_kb": 64, 00:18:03.487 "state": "configuring", 00:18:03.487 "raid_level": "concat", 00:18:03.487 "superblock": false, 00:18:03.487 "num_base_bdevs": 4, 00:18:03.487 "num_base_bdevs_discovered": 3, 00:18:03.487 "num_base_bdevs_operational": 4, 00:18:03.487 "base_bdevs_list": [ 00:18:03.487 { 00:18:03.487 "name": "BaseBdev1", 00:18:03.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.487 "is_configured": false, 00:18:03.487 "data_offset": 0, 00:18:03.487 "data_size": 0 00:18:03.487 }, 00:18:03.487 { 00:18:03.487 "name": "BaseBdev2", 00:18:03.487 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:03.487 "is_configured": true, 00:18:03.487 "data_offset": 0, 00:18:03.487 "data_size": 65536 00:18:03.487 }, 00:18:03.487 { 00:18:03.487 "name": "BaseBdev3", 00:18:03.487 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:03.487 "is_configured": true, 00:18:03.487 "data_offset": 0, 00:18:03.487 "data_size": 65536 00:18:03.487 }, 00:18:03.487 { 00:18:03.487 "name": "BaseBdev4", 00:18:03.487 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:03.487 "is_configured": true, 00:18:03.487 "data_offset": 0, 00:18:03.487 "data_size": 65536 00:18:03.487 } 00:18:03.487 ] 00:18:03.487 }' 00:18:03.487 07:53:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.487 07:53:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.746 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
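A minimal standalone sketch of the RPC sequence this stretch of raid_state_function_test drives — it is not part of the captured trace; it assumes an SPDK target is already serving /var/tmp/spdk-raid.sock and simply reuses the bdev names, sizes, and jq filters visible in the log above:

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

# Recreate the malloc base bdevs (32 MiB, 512-byte blocks), deliberately leaving BaseBdev1 out
for b in BaseBdev2 BaseBdev3 BaseBdev4; do
  rpc bdev_malloc_create 32 512 -b "$b"
  rpc bdev_wait_for_examine
done

# Creating the concat array with one member missing keeps it in the "configuring" state
rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# Drop BaseBdev2 from the array and confirm slot 1 is no longer configured
rpc bdev_raid_remove_base_bdev BaseBdev2
rpc bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'   # expect: false

# Re-attach it, then tear everything down
rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2
rpc bdev_raid_delete Existed_Raid
for b in BaseBdev2 BaseBdev3 BaseBdev4; do rpc bdev_malloc_delete "$b"; done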
00:18:04.005 [2024-07-15 07:53:48.650928] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.005 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.265 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.265 "name": "Existed_Raid", 00:18:04.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.265 "strip_size_kb": 64, 00:18:04.265 "state": "configuring", 00:18:04.265 "raid_level": "concat", 00:18:04.265 "superblock": false, 00:18:04.265 "num_base_bdevs": 4, 00:18:04.265 "num_base_bdevs_discovered": 2, 00:18:04.265 "num_base_bdevs_operational": 4, 00:18:04.265 "base_bdevs_list": [ 00:18:04.265 { 00:18:04.265 "name": "BaseBdev1", 00:18:04.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.265 "is_configured": false, 00:18:04.265 "data_offset": 0, 00:18:04.265 "data_size": 0 00:18:04.265 }, 00:18:04.265 { 00:18:04.265 "name": null, 00:18:04.265 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:04.265 "is_configured": false, 00:18:04.265 "data_offset": 0, 00:18:04.265 "data_size": 65536 00:18:04.265 }, 00:18:04.265 { 00:18:04.265 "name": "BaseBdev3", 00:18:04.265 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:04.265 "is_configured": true, 00:18:04.265 "data_offset": 0, 00:18:04.265 "data_size": 65536 00:18:04.265 }, 00:18:04.265 { 00:18:04.265 "name": "BaseBdev4", 00:18:04.265 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:04.265 "is_configured": true, 00:18:04.265 "data_offset": 0, 00:18:04.265 "data_size": 65536 00:18:04.265 } 00:18:04.265 ] 00:18:04.265 }' 00:18:04.265 07:53:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.265 07:53:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.834 07:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.834 07:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:18:05.094 07:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:05.094 07:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:05.355 [2024-07-15 07:53:49.859045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:05.355 BaseBdev1 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:05.355 07:53:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:05.355 07:53:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:05.615 [ 00:18:05.615 { 00:18:05.615 "name": "BaseBdev1", 00:18:05.615 "aliases": [ 00:18:05.615 "204957ae-19ea-4d33-afc8-3d4b5cedba1d" 00:18:05.615 ], 00:18:05.615 "product_name": "Malloc disk", 00:18:05.615 "block_size": 512, 00:18:05.615 "num_blocks": 65536, 00:18:05.615 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:05.615 "assigned_rate_limits": { 00:18:05.615 "rw_ios_per_sec": 0, 00:18:05.615 "rw_mbytes_per_sec": 0, 00:18:05.615 "r_mbytes_per_sec": 0, 00:18:05.615 "w_mbytes_per_sec": 0 00:18:05.615 }, 00:18:05.615 "claimed": true, 00:18:05.615 "claim_type": "exclusive_write", 00:18:05.615 "zoned": false, 00:18:05.615 "supported_io_types": { 00:18:05.615 "read": true, 00:18:05.615 "write": true, 00:18:05.615 "unmap": true, 00:18:05.615 "flush": true, 00:18:05.615 "reset": true, 00:18:05.615 "nvme_admin": false, 00:18:05.615 "nvme_io": false, 00:18:05.615 "nvme_io_md": false, 00:18:05.615 "write_zeroes": true, 00:18:05.615 "zcopy": true, 00:18:05.615 "get_zone_info": false, 00:18:05.615 "zone_management": false, 00:18:05.615 "zone_append": false, 00:18:05.615 "compare": false, 00:18:05.615 "compare_and_write": false, 00:18:05.615 "abort": true, 00:18:05.615 "seek_hole": false, 00:18:05.615 "seek_data": false, 00:18:05.615 "copy": true, 00:18:05.615 "nvme_iov_md": false 00:18:05.615 }, 00:18:05.615 "memory_domains": [ 00:18:05.615 { 00:18:05.615 "dma_device_id": "system", 00:18:05.615 "dma_device_type": 1 00:18:05.615 }, 00:18:05.615 { 00:18:05.615 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.615 "dma_device_type": 2 00:18:05.615 } 00:18:05.615 ], 00:18:05.615 "driver_specific": {} 00:18:05.615 } 00:18:05.615 ] 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:05.615 07:53:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.615 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.876 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.876 "name": "Existed_Raid", 00:18:05.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.876 "strip_size_kb": 64, 00:18:05.876 "state": "configuring", 00:18:05.876 "raid_level": "concat", 00:18:05.876 "superblock": false, 00:18:05.876 "num_base_bdevs": 4, 00:18:05.876 "num_base_bdevs_discovered": 3, 00:18:05.876 "num_base_bdevs_operational": 4, 00:18:05.876 "base_bdevs_list": [ 00:18:05.876 { 00:18:05.876 "name": "BaseBdev1", 00:18:05.876 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:05.876 "is_configured": true, 00:18:05.876 "data_offset": 0, 00:18:05.876 "data_size": 65536 00:18:05.876 }, 00:18:05.876 { 00:18:05.876 "name": null, 00:18:05.876 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:05.876 "is_configured": false, 00:18:05.876 "data_offset": 0, 00:18:05.876 "data_size": 65536 00:18:05.876 }, 00:18:05.876 { 00:18:05.876 "name": "BaseBdev3", 00:18:05.876 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:05.876 "is_configured": true, 00:18:05.876 "data_offset": 0, 00:18:05.876 "data_size": 65536 00:18:05.876 }, 00:18:05.876 { 00:18:05.876 "name": "BaseBdev4", 00:18:05.876 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:05.876 "is_configured": true, 00:18:05.876 "data_offset": 0, 00:18:05.876 "data_size": 65536 00:18:05.876 } 00:18:05.876 ] 00:18:05.876 }' 00:18:05.876 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.876 07:53:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.444 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.444 07:53:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:06.444 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:06.444 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:06.703 [2024-07-15 07:53:51.362878] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.703 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.963 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.963 "name": "Existed_Raid", 00:18:06.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.963 "strip_size_kb": 64, 00:18:06.963 "state": "configuring", 00:18:06.963 "raid_level": "concat", 00:18:06.963 "superblock": false, 00:18:06.963 "num_base_bdevs": 4, 00:18:06.963 "num_base_bdevs_discovered": 2, 00:18:06.963 "num_base_bdevs_operational": 4, 00:18:06.963 "base_bdevs_list": [ 00:18:06.963 { 00:18:06.963 "name": "BaseBdev1", 00:18:06.963 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:06.963 "is_configured": true, 00:18:06.963 "data_offset": 0, 00:18:06.963 "data_size": 65536 00:18:06.963 }, 00:18:06.963 { 00:18:06.963 "name": null, 00:18:06.963 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:06.963 "is_configured": false, 00:18:06.963 "data_offset": 0, 00:18:06.963 "data_size": 65536 00:18:06.963 }, 00:18:06.963 { 00:18:06.963 "name": null, 00:18:06.963 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:06.963 "is_configured": false, 00:18:06.963 "data_offset": 0, 00:18:06.963 "data_size": 65536 00:18:06.963 }, 00:18:06.963 { 00:18:06.963 "name": "BaseBdev4", 00:18:06.963 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:06.963 "is_configured": true, 00:18:06.963 "data_offset": 0, 00:18:06.963 "data_size": 65536 00:18:06.963 } 00:18:06.963 ] 00:18:06.963 }' 00:18:06.963 07:53:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.963 07:53:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.547 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.547 07:53:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:08.130 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:08.130 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:08.390 [2024-07-15 07:53:52.982993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:08.390 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:08.390 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.390 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.390 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:08.390 07:53:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.390 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.960 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.960 "name": "Existed_Raid", 00:18:08.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.960 "strip_size_kb": 64, 00:18:08.960 "state": "configuring", 00:18:08.960 "raid_level": "concat", 00:18:08.960 "superblock": false, 00:18:08.960 "num_base_bdevs": 4, 00:18:08.960 "num_base_bdevs_discovered": 3, 00:18:08.960 "num_base_bdevs_operational": 4, 00:18:08.960 "base_bdevs_list": [ 00:18:08.960 { 00:18:08.960 "name": "BaseBdev1", 00:18:08.960 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:08.960 "is_configured": true, 00:18:08.960 "data_offset": 0, 00:18:08.960 "data_size": 65536 00:18:08.960 }, 00:18:08.960 { 00:18:08.960 "name": null, 00:18:08.960 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:08.960 "is_configured": false, 00:18:08.960 "data_offset": 0, 00:18:08.960 "data_size": 65536 00:18:08.960 }, 00:18:08.960 { 00:18:08.960 "name": "BaseBdev3", 00:18:08.960 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:08.960 "is_configured": true, 00:18:08.960 "data_offset": 0, 00:18:08.960 "data_size": 65536 00:18:08.960 }, 00:18:08.960 { 00:18:08.960 "name": "BaseBdev4", 00:18:08.960 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:08.960 "is_configured": true, 00:18:08.960 "data_offset": 0, 00:18:08.960 "data_size": 65536 00:18:08.960 } 00:18:08.960 ] 00:18:08.960 }' 00:18:08.960 07:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:18:08.960 07:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.529 07:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.529 07:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:10.099 07:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:10.099 07:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:10.361 [2024-07-15 07:53:54.984141] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.361 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.622 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.622 "name": "Existed_Raid", 00:18:10.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.622 "strip_size_kb": 64, 00:18:10.622 "state": "configuring", 00:18:10.622 "raid_level": "concat", 00:18:10.622 "superblock": false, 00:18:10.622 "num_base_bdevs": 4, 00:18:10.622 "num_base_bdevs_discovered": 2, 00:18:10.622 "num_base_bdevs_operational": 4, 00:18:10.622 "base_bdevs_list": [ 00:18:10.622 { 00:18:10.622 "name": null, 00:18:10.622 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:10.622 "is_configured": false, 00:18:10.622 "data_offset": 0, 00:18:10.622 "data_size": 65536 00:18:10.622 }, 00:18:10.622 { 00:18:10.622 "name": null, 00:18:10.622 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:10.622 "is_configured": false, 00:18:10.622 "data_offset": 0, 00:18:10.622 "data_size": 65536 00:18:10.622 }, 00:18:10.622 { 00:18:10.622 "name": "BaseBdev3", 00:18:10.622 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:10.622 "is_configured": true, 00:18:10.622 "data_offset": 0, 00:18:10.622 "data_size": 65536 00:18:10.622 }, 00:18:10.622 { 
00:18:10.622 "name": "BaseBdev4", 00:18:10.622 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:10.622 "is_configured": true, 00:18:10.622 "data_offset": 0, 00:18:10.622 "data_size": 65536 00:18:10.622 } 00:18:10.622 ] 00:18:10.622 }' 00:18:10.622 07:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.622 07:53:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.563 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.563 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:11.824 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:11.824 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:12.085 [2024-07-15 07:53:56.642243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.085 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.345 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.345 "name": "Existed_Raid", 00:18:12.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.345 "strip_size_kb": 64, 00:18:12.345 "state": "configuring", 00:18:12.345 "raid_level": "concat", 00:18:12.345 "superblock": false, 00:18:12.345 "num_base_bdevs": 4, 00:18:12.345 "num_base_bdevs_discovered": 3, 00:18:12.345 "num_base_bdevs_operational": 4, 00:18:12.345 "base_bdevs_list": [ 00:18:12.345 { 00:18:12.345 "name": null, 00:18:12.345 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:12.345 "is_configured": false, 00:18:12.345 "data_offset": 0, 00:18:12.345 "data_size": 65536 00:18:12.345 }, 00:18:12.345 { 00:18:12.345 "name": "BaseBdev2", 00:18:12.345 "uuid": 
"52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:12.345 "is_configured": true, 00:18:12.345 "data_offset": 0, 00:18:12.345 "data_size": 65536 00:18:12.345 }, 00:18:12.345 { 00:18:12.345 "name": "BaseBdev3", 00:18:12.345 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:12.345 "is_configured": true, 00:18:12.345 "data_offset": 0, 00:18:12.345 "data_size": 65536 00:18:12.345 }, 00:18:12.345 { 00:18:12.345 "name": "BaseBdev4", 00:18:12.345 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:12.345 "is_configured": true, 00:18:12.345 "data_offset": 0, 00:18:12.345 "data_size": 65536 00:18:12.345 } 00:18:12.345 ] 00:18:12.345 }' 00:18:12.345 07:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.345 07:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.287 07:53:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.287 07:53:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:13.859 07:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:13.859 07:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.859 07:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:13.859 07:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 204957ae-19ea-4d33-afc8-3d4b5cedba1d 00:18:14.119 [2024-07-15 07:53:58.744586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:14.119 [2024-07-15 07:53:58.744610] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2793ba0 00:18:14.119 [2024-07-15 07:53:58.744615] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:14.119 [2024-07-15 07:53:58.744768] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2798bd0 00:18:14.119 [2024-07-15 07:53:58.744859] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2793ba0 00:18:14.119 [2024-07-15 07:53:58.744865] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2793ba0 00:18:14.119 [2024-07-15 07:53:58.744988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:14.119 NewBaseBdev 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:14.119 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:14.379 07:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:14.379 [ 00:18:14.379 { 00:18:14.379 "name": "NewBaseBdev", 00:18:14.379 "aliases": [ 00:18:14.379 "204957ae-19ea-4d33-afc8-3d4b5cedba1d" 00:18:14.379 ], 00:18:14.379 "product_name": "Malloc disk", 00:18:14.379 "block_size": 512, 00:18:14.379 "num_blocks": 65536, 00:18:14.379 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:14.379 "assigned_rate_limits": { 00:18:14.379 "rw_ios_per_sec": 0, 00:18:14.379 "rw_mbytes_per_sec": 0, 00:18:14.379 "r_mbytes_per_sec": 0, 00:18:14.379 "w_mbytes_per_sec": 0 00:18:14.379 }, 00:18:14.379 "claimed": true, 00:18:14.379 "claim_type": "exclusive_write", 00:18:14.379 "zoned": false, 00:18:14.379 "supported_io_types": { 00:18:14.379 "read": true, 00:18:14.379 "write": true, 00:18:14.379 "unmap": true, 00:18:14.379 "flush": true, 00:18:14.379 "reset": true, 00:18:14.379 "nvme_admin": false, 00:18:14.379 "nvme_io": false, 00:18:14.379 "nvme_io_md": false, 00:18:14.379 "write_zeroes": true, 00:18:14.379 "zcopy": true, 00:18:14.379 "get_zone_info": false, 00:18:14.379 "zone_management": false, 00:18:14.379 "zone_append": false, 00:18:14.379 "compare": false, 00:18:14.379 "compare_and_write": false, 00:18:14.379 "abort": true, 00:18:14.379 "seek_hole": false, 00:18:14.379 "seek_data": false, 00:18:14.379 "copy": true, 00:18:14.379 "nvme_iov_md": false 00:18:14.379 }, 00:18:14.379 "memory_domains": [ 00:18:14.379 { 00:18:14.379 "dma_device_id": "system", 00:18:14.379 "dma_device_type": 1 00:18:14.379 }, 00:18:14.379 { 00:18:14.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.379 "dma_device_type": 2 00:18:14.379 } 00:18:14.379 ], 00:18:14.379 "driver_specific": {} 00:18:14.379 } 00:18:14.379 ] 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.639 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:18:14.640 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.640 "name": "Existed_Raid", 00:18:14.640 "uuid": "0449c1ef-0919-44e6-a70d-9f53c21f55df", 00:18:14.640 "strip_size_kb": 64, 00:18:14.640 "state": "online", 00:18:14.640 "raid_level": "concat", 00:18:14.640 "superblock": false, 00:18:14.640 "num_base_bdevs": 4, 00:18:14.640 "num_base_bdevs_discovered": 4, 00:18:14.640 "num_base_bdevs_operational": 4, 00:18:14.640 "base_bdevs_list": [ 00:18:14.640 { 00:18:14.640 "name": "NewBaseBdev", 00:18:14.640 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:14.640 "is_configured": true, 00:18:14.640 "data_offset": 0, 00:18:14.640 "data_size": 65536 00:18:14.640 }, 00:18:14.640 { 00:18:14.640 "name": "BaseBdev2", 00:18:14.640 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:14.640 "is_configured": true, 00:18:14.640 "data_offset": 0, 00:18:14.640 "data_size": 65536 00:18:14.640 }, 00:18:14.640 { 00:18:14.640 "name": "BaseBdev3", 00:18:14.640 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:14.640 "is_configured": true, 00:18:14.640 "data_offset": 0, 00:18:14.640 "data_size": 65536 00:18:14.640 }, 00:18:14.640 { 00:18:14.640 "name": "BaseBdev4", 00:18:14.640 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:14.640 "is_configured": true, 00:18:14.640 "data_offset": 0, 00:18:14.640 "data_size": 65536 00:18:14.640 } 00:18:14.640 ] 00:18:14.640 }' 00:18:14.640 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.640 07:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:15.210 07:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:15.470 [2024-07-15 07:54:00.128477] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:15.470 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:15.470 "name": "Existed_Raid", 00:18:15.470 "aliases": [ 00:18:15.470 "0449c1ef-0919-44e6-a70d-9f53c21f55df" 00:18:15.470 ], 00:18:15.470 "product_name": "Raid Volume", 00:18:15.470 "block_size": 512, 00:18:15.470 "num_blocks": 262144, 00:18:15.470 "uuid": "0449c1ef-0919-44e6-a70d-9f53c21f55df", 00:18:15.471 "assigned_rate_limits": { 00:18:15.471 "rw_ios_per_sec": 0, 00:18:15.471 "rw_mbytes_per_sec": 0, 00:18:15.471 "r_mbytes_per_sec": 0, 00:18:15.471 "w_mbytes_per_sec": 0 00:18:15.471 }, 00:18:15.471 "claimed": false, 00:18:15.471 "zoned": false, 00:18:15.471 "supported_io_types": { 00:18:15.471 "read": true, 00:18:15.471 "write": true, 00:18:15.471 "unmap": true, 
00:18:15.471 "flush": true, 00:18:15.471 "reset": true, 00:18:15.471 "nvme_admin": false, 00:18:15.471 "nvme_io": false, 00:18:15.471 "nvme_io_md": false, 00:18:15.471 "write_zeroes": true, 00:18:15.471 "zcopy": false, 00:18:15.471 "get_zone_info": false, 00:18:15.471 "zone_management": false, 00:18:15.471 "zone_append": false, 00:18:15.471 "compare": false, 00:18:15.471 "compare_and_write": false, 00:18:15.471 "abort": false, 00:18:15.471 "seek_hole": false, 00:18:15.471 "seek_data": false, 00:18:15.471 "copy": false, 00:18:15.471 "nvme_iov_md": false 00:18:15.471 }, 00:18:15.471 "memory_domains": [ 00:18:15.471 { 00:18:15.471 "dma_device_id": "system", 00:18:15.471 "dma_device_type": 1 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.471 "dma_device_type": 2 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "system", 00:18:15.471 "dma_device_type": 1 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.471 "dma_device_type": 2 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "system", 00:18:15.471 "dma_device_type": 1 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.471 "dma_device_type": 2 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "system", 00:18:15.471 "dma_device_type": 1 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.471 "dma_device_type": 2 00:18:15.471 } 00:18:15.471 ], 00:18:15.471 "driver_specific": { 00:18:15.471 "raid": { 00:18:15.471 "uuid": "0449c1ef-0919-44e6-a70d-9f53c21f55df", 00:18:15.471 "strip_size_kb": 64, 00:18:15.471 "state": "online", 00:18:15.471 "raid_level": "concat", 00:18:15.471 "superblock": false, 00:18:15.471 "num_base_bdevs": 4, 00:18:15.471 "num_base_bdevs_discovered": 4, 00:18:15.471 "num_base_bdevs_operational": 4, 00:18:15.471 "base_bdevs_list": [ 00:18:15.471 { 00:18:15.471 "name": "NewBaseBdev", 00:18:15.471 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:15.471 "is_configured": true, 00:18:15.471 "data_offset": 0, 00:18:15.471 "data_size": 65536 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "name": "BaseBdev2", 00:18:15.471 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:15.471 "is_configured": true, 00:18:15.471 "data_offset": 0, 00:18:15.471 "data_size": 65536 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "name": "BaseBdev3", 00:18:15.471 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:15.471 "is_configured": true, 00:18:15.471 "data_offset": 0, 00:18:15.471 "data_size": 65536 00:18:15.471 }, 00:18:15.471 { 00:18:15.471 "name": "BaseBdev4", 00:18:15.471 "uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:15.471 "is_configured": true, 00:18:15.471 "data_offset": 0, 00:18:15.471 "data_size": 65536 00:18:15.471 } 00:18:15.471 ] 00:18:15.471 } 00:18:15.471 } 00:18:15.471 }' 00:18:15.471 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:15.471 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:15.471 BaseBdev2 00:18:15.471 BaseBdev3 00:18:15.471 BaseBdev4' 00:18:15.471 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.471 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:18:15.471 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.731 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.731 "name": "NewBaseBdev", 00:18:15.731 "aliases": [ 00:18:15.731 "204957ae-19ea-4d33-afc8-3d4b5cedba1d" 00:18:15.731 ], 00:18:15.731 "product_name": "Malloc disk", 00:18:15.731 "block_size": 512, 00:18:15.731 "num_blocks": 65536, 00:18:15.731 "uuid": "204957ae-19ea-4d33-afc8-3d4b5cedba1d", 00:18:15.731 "assigned_rate_limits": { 00:18:15.731 "rw_ios_per_sec": 0, 00:18:15.731 "rw_mbytes_per_sec": 0, 00:18:15.731 "r_mbytes_per_sec": 0, 00:18:15.731 "w_mbytes_per_sec": 0 00:18:15.731 }, 00:18:15.731 "claimed": true, 00:18:15.731 "claim_type": "exclusive_write", 00:18:15.731 "zoned": false, 00:18:15.731 "supported_io_types": { 00:18:15.731 "read": true, 00:18:15.731 "write": true, 00:18:15.731 "unmap": true, 00:18:15.731 "flush": true, 00:18:15.731 "reset": true, 00:18:15.731 "nvme_admin": false, 00:18:15.731 "nvme_io": false, 00:18:15.731 "nvme_io_md": false, 00:18:15.731 "write_zeroes": true, 00:18:15.731 "zcopy": true, 00:18:15.731 "get_zone_info": false, 00:18:15.731 "zone_management": false, 00:18:15.731 "zone_append": false, 00:18:15.731 "compare": false, 00:18:15.731 "compare_and_write": false, 00:18:15.731 "abort": true, 00:18:15.731 "seek_hole": false, 00:18:15.731 "seek_data": false, 00:18:15.731 "copy": true, 00:18:15.731 "nvme_iov_md": false 00:18:15.731 }, 00:18:15.731 "memory_domains": [ 00:18:15.731 { 00:18:15.731 "dma_device_id": "system", 00:18:15.731 "dma_device_type": 1 00:18:15.731 }, 00:18:15.731 { 00:18:15.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.731 "dma_device_type": 2 00:18:15.731 } 00:18:15.731 ], 00:18:15.731 "driver_specific": {} 00:18:15.731 }' 00:18:15.731 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.731 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.731 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.731 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.992 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:16.252 07:54:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.252 "name": "BaseBdev2", 00:18:16.252 "aliases": [ 00:18:16.252 "52e1c1a1-8a2a-446e-abbc-b98a9027f45c" 00:18:16.252 ], 00:18:16.252 "product_name": "Malloc disk", 00:18:16.252 "block_size": 512, 00:18:16.252 "num_blocks": 65536, 00:18:16.252 "uuid": "52e1c1a1-8a2a-446e-abbc-b98a9027f45c", 00:18:16.252 "assigned_rate_limits": { 00:18:16.252 "rw_ios_per_sec": 0, 00:18:16.252 "rw_mbytes_per_sec": 0, 00:18:16.252 "r_mbytes_per_sec": 0, 00:18:16.252 "w_mbytes_per_sec": 0 00:18:16.252 }, 00:18:16.252 "claimed": true, 00:18:16.252 "claim_type": "exclusive_write", 00:18:16.252 "zoned": false, 00:18:16.252 "supported_io_types": { 00:18:16.252 "read": true, 00:18:16.252 "write": true, 00:18:16.252 "unmap": true, 00:18:16.252 "flush": true, 00:18:16.252 "reset": true, 00:18:16.252 "nvme_admin": false, 00:18:16.252 "nvme_io": false, 00:18:16.252 "nvme_io_md": false, 00:18:16.252 "write_zeroes": true, 00:18:16.252 "zcopy": true, 00:18:16.252 "get_zone_info": false, 00:18:16.252 "zone_management": false, 00:18:16.252 "zone_append": false, 00:18:16.252 "compare": false, 00:18:16.252 "compare_and_write": false, 00:18:16.252 "abort": true, 00:18:16.252 "seek_hole": false, 00:18:16.252 "seek_data": false, 00:18:16.252 "copy": true, 00:18:16.252 "nvme_iov_md": false 00:18:16.252 }, 00:18:16.252 "memory_domains": [ 00:18:16.252 { 00:18:16.252 "dma_device_id": "system", 00:18:16.252 "dma_device_type": 1 00:18:16.252 }, 00:18:16.252 { 00:18:16.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.252 "dma_device_type": 2 00:18:16.252 } 00:18:16.252 ], 00:18:16.252 "driver_specific": {} 00:18:16.252 }' 00:18:16.252 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.252 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.252 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:16.252 07:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:16.513 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:16.774 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:16.774 "name": "BaseBdev3", 00:18:16.774 "aliases": [ 00:18:16.774 
"a0431b84-a725-49ef-872c-a0367ff377b6" 00:18:16.774 ], 00:18:16.774 "product_name": "Malloc disk", 00:18:16.774 "block_size": 512, 00:18:16.774 "num_blocks": 65536, 00:18:16.774 "uuid": "a0431b84-a725-49ef-872c-a0367ff377b6", 00:18:16.774 "assigned_rate_limits": { 00:18:16.774 "rw_ios_per_sec": 0, 00:18:16.774 "rw_mbytes_per_sec": 0, 00:18:16.774 "r_mbytes_per_sec": 0, 00:18:16.774 "w_mbytes_per_sec": 0 00:18:16.774 }, 00:18:16.774 "claimed": true, 00:18:16.774 "claim_type": "exclusive_write", 00:18:16.774 "zoned": false, 00:18:16.774 "supported_io_types": { 00:18:16.774 "read": true, 00:18:16.774 "write": true, 00:18:16.774 "unmap": true, 00:18:16.774 "flush": true, 00:18:16.774 "reset": true, 00:18:16.774 "nvme_admin": false, 00:18:16.774 "nvme_io": false, 00:18:16.774 "nvme_io_md": false, 00:18:16.774 "write_zeroes": true, 00:18:16.774 "zcopy": true, 00:18:16.774 "get_zone_info": false, 00:18:16.774 "zone_management": false, 00:18:16.774 "zone_append": false, 00:18:16.774 "compare": false, 00:18:16.774 "compare_and_write": false, 00:18:16.774 "abort": true, 00:18:16.774 "seek_hole": false, 00:18:16.774 "seek_data": false, 00:18:16.774 "copy": true, 00:18:16.774 "nvme_iov_md": false 00:18:16.774 }, 00:18:16.774 "memory_domains": [ 00:18:16.774 { 00:18:16.774 "dma_device_id": "system", 00:18:16.774 "dma_device_type": 1 00:18:16.774 }, 00:18:16.774 { 00:18:16.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.774 "dma_device_type": 2 00:18:16.774 } 00:18:16.774 ], 00:18:16.774 "driver_specific": {} 00:18:16.774 }' 00:18:16.774 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:16.774 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.034 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.034 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.034 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.034 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.034 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.034 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.294 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.295 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.295 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:17.295 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:17.295 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.295 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:17.295 07:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.865 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.865 "name": "BaseBdev4", 00:18:17.865 "aliases": [ 00:18:17.865 "99e24c18-ef1d-4d42-87e9-d8a2afc11946" 00:18:17.865 ], 00:18:17.865 "product_name": "Malloc disk", 00:18:17.865 "block_size": 512, 00:18:17.865 "num_blocks": 65536, 00:18:17.865 
"uuid": "99e24c18-ef1d-4d42-87e9-d8a2afc11946", 00:18:17.865 "assigned_rate_limits": { 00:18:17.865 "rw_ios_per_sec": 0, 00:18:17.865 "rw_mbytes_per_sec": 0, 00:18:17.865 "r_mbytes_per_sec": 0, 00:18:17.865 "w_mbytes_per_sec": 0 00:18:17.865 }, 00:18:17.865 "claimed": true, 00:18:17.865 "claim_type": "exclusive_write", 00:18:17.865 "zoned": false, 00:18:17.865 "supported_io_types": { 00:18:17.865 "read": true, 00:18:17.865 "write": true, 00:18:17.865 "unmap": true, 00:18:17.865 "flush": true, 00:18:17.865 "reset": true, 00:18:17.865 "nvme_admin": false, 00:18:17.865 "nvme_io": false, 00:18:17.865 "nvme_io_md": false, 00:18:17.866 "write_zeroes": true, 00:18:17.866 "zcopy": true, 00:18:17.866 "get_zone_info": false, 00:18:17.866 "zone_management": false, 00:18:17.866 "zone_append": false, 00:18:17.866 "compare": false, 00:18:17.866 "compare_and_write": false, 00:18:17.866 "abort": true, 00:18:17.866 "seek_hole": false, 00:18:17.866 "seek_data": false, 00:18:17.866 "copy": true, 00:18:17.866 "nvme_iov_md": false 00:18:17.866 }, 00:18:17.866 "memory_domains": [ 00:18:17.866 { 00:18:17.866 "dma_device_id": "system", 00:18:17.866 "dma_device_type": 1 00:18:17.866 }, 00:18:17.866 { 00:18:17.866 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.866 "dma_device_type": 2 00:18:17.866 } 00:18:17.866 ], 00:18:17.866 "driver_specific": {} 00:18:17.866 }' 00:18:17.866 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.866 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.126 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.126 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.126 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.126 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.126 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.126 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.385 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.385 07:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.385 07:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.385 07:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.385 07:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:18.649 [2024-07-15 07:54:03.276163] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:18.649 [2024-07-15 07:54:03.276180] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.649 [2024-07-15 07:54:03.276220] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.649 [2024-07-15 07:54:03.276263] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.649 [2024-07-15 07:54:03.276269] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2793ba0 name Existed_Raid, state offline 00:18:18.649 07:54:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 1670166 00:18:18.649 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1670166 ']' 00:18:18.649 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1670166 00:18:18.649 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1670166 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1670166' 00:18:18.650 killing process with pid 1670166 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1670166 00:18:18.650 [2024-07-15 07:54:03.366496] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:18.650 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1670166 00:18:18.650 [2024-07-15 07:54:03.386739] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:18.911 00:18:18.911 real 0m30.653s 00:18:18.911 user 0m58.059s 00:18:18.911 sys 0m4.122s 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.911 ************************************ 00:18:18.911 END TEST raid_state_function_test 00:18:18.911 ************************************ 00:18:18.911 07:54:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:18.911 07:54:03 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:18:18.911 07:54:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:18.911 07:54:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:18.911 07:54:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:18.911 ************************************ 00:18:18.911 START TEST raid_state_function_test_sb 00:18:18.911 ************************************ 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:18.911 
07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:18.911 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1675897 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1675897' 00:18:18.912 Process raid pid: 1675897 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1675897 /var/tmp/spdk-raid.sock 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1675897 ']' 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:18.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:18.912 07:54:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:18.912 [2024-07-15 07:54:03.636515] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:18:18.912 [2024-07-15 07:54:03.636566] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:19.171 [2024-07-15 07:54:03.709688] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:19.171 [2024-07-15 07:54:03.778159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:19.171 [2024-07-15 07:54:03.826023] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:19.171 [2024-07-15 07:54:03.826047] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:19.431 07:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:19.431 07:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:19.431 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:19.691 [2024-07-15 07:54:04.337049] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:19.691 [2024-07-15 07:54:04.337077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:19.691 [2024-07-15 07:54:04.337083] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:19.691 [2024-07-15 07:54:04.337089] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:19.691 [2024-07-15 07:54:04.337094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:19.691 [2024-07-15 07:54:04.337099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:19.691 [2024-07-15 07:54:04.337103] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:19.691 [2024-07-15 07:54:04.337109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
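The trace above boils down to a short RPC exchange against the bdev_svc app listening on /var/tmp/spdk-raid.sock: the raid bdev is requested with a superblock before any of its members exist, so it parks in the "configuring" state. A minimal manual sketch of that exchange, run from an SPDK checkout and reusing this test's Existed_Raid/BaseBdev* names (the full Jenkins workspace paths above are shortened to in-tree relative paths here):

# Start a bare bdev application with raid debug logging, as the test does above.
./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &

# Ask for a 4-member concat array with a 64 KiB strip and an on-disk superblock (-s)
# while none of the base bdevs exist yet; the raid bdev is registered and left waiting.
./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# The condition verify_raid_bdev_state checks next: state "configuring",
# num_base_bdevs_discovered 0, num_base_bdevs_operational 4.
./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .state'

The -z 64 strip size is passed only because the level is concat; the branch at bdev_raid.sh@230 above skips it for raid1, and -s is what distinguishes this superblock variant from the plain raid_state_function_test that just finished.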
00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.691 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.259 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.259 "name": "Existed_Raid", 00:18:20.259 "uuid": "0160d089-4af6-4791-99ec-eb1e6fc4fd98", 00:18:20.259 "strip_size_kb": 64, 00:18:20.259 "state": "configuring", 00:18:20.259 "raid_level": "concat", 00:18:20.259 "superblock": true, 00:18:20.259 "num_base_bdevs": 4, 00:18:20.259 "num_base_bdevs_discovered": 0, 00:18:20.259 "num_base_bdevs_operational": 4, 00:18:20.259 "base_bdevs_list": [ 00:18:20.259 { 00:18:20.259 "name": "BaseBdev1", 00:18:20.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.259 "is_configured": false, 00:18:20.259 "data_offset": 0, 00:18:20.259 "data_size": 0 00:18:20.259 }, 00:18:20.259 { 00:18:20.259 "name": "BaseBdev2", 00:18:20.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.259 "is_configured": false, 00:18:20.259 "data_offset": 0, 00:18:20.259 "data_size": 0 00:18:20.259 }, 00:18:20.259 { 00:18:20.259 "name": "BaseBdev3", 00:18:20.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.259 "is_configured": false, 00:18:20.259 "data_offset": 0, 00:18:20.259 "data_size": 0 00:18:20.259 }, 00:18:20.259 { 00:18:20.259 "name": "BaseBdev4", 00:18:20.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.259 "is_configured": false, 00:18:20.259 "data_offset": 0, 00:18:20.259 "data_size": 0 00:18:20.259 } 00:18:20.259 ] 00:18:20.259 }' 00:18:20.259 07:54:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.259 07:54:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.827 07:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:21.087 [2024-07-15 07:54:05.688307] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:21.087 [2024-07-15 07:54:05.688326] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d76f0 name Existed_Raid, state configuring 00:18:21.087 07:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:21.347 [2024-07-15 07:54:05.872803] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:21.347 [2024-07-15 07:54:05.872818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:21.347 [2024-07-15 07:54:05.872822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:18:21.347 [2024-07-15 07:54:05.872828] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:21.347 [2024-07-15 07:54:05.872833] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:21.347 [2024-07-15 07:54:05.872838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:21.347 [2024-07-15 07:54:05.872843] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:21.347 [2024-07-15 07:54:05.872848] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:21.347 07:54:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:21.347 [2024-07-15 07:54:06.071908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:21.347 BaseBdev1 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:21.347 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:21.607 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:21.867 [ 00:18:21.867 { 00:18:21.867 "name": "BaseBdev1", 00:18:21.867 "aliases": [ 00:18:21.867 "a1edc924-697d-4092-b8b2-e44833dfc28b" 00:18:21.867 ], 00:18:21.867 "product_name": "Malloc disk", 00:18:21.867 "block_size": 512, 00:18:21.867 "num_blocks": 65536, 00:18:21.867 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:21.867 "assigned_rate_limits": { 00:18:21.867 "rw_ios_per_sec": 0, 00:18:21.867 "rw_mbytes_per_sec": 0, 00:18:21.867 "r_mbytes_per_sec": 0, 00:18:21.867 "w_mbytes_per_sec": 0 00:18:21.867 }, 00:18:21.867 "claimed": true, 00:18:21.867 "claim_type": "exclusive_write", 00:18:21.867 "zoned": false, 00:18:21.867 "supported_io_types": { 00:18:21.867 "read": true, 00:18:21.867 "write": true, 00:18:21.867 "unmap": true, 00:18:21.867 "flush": true, 00:18:21.867 "reset": true, 00:18:21.867 "nvme_admin": false, 00:18:21.867 "nvme_io": false, 00:18:21.867 "nvme_io_md": false, 00:18:21.867 "write_zeroes": true, 00:18:21.867 "zcopy": true, 00:18:21.867 "get_zone_info": false, 00:18:21.867 "zone_management": false, 00:18:21.867 "zone_append": false, 00:18:21.867 "compare": false, 00:18:21.867 "compare_and_write": false, 00:18:21.867 "abort": true, 00:18:21.867 "seek_hole": false, 00:18:21.867 "seek_data": false, 00:18:21.867 "copy": true, 00:18:21.867 "nvme_iov_md": false 00:18:21.867 }, 00:18:21.867 "memory_domains": [ 00:18:21.867 { 00:18:21.867 
"dma_device_id": "system", 00:18:21.867 "dma_device_type": 1 00:18:21.867 }, 00:18:21.867 { 00:18:21.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.867 "dma_device_type": 2 00:18:21.867 } 00:18:21.867 ], 00:18:21.867 "driver_specific": {} 00:18:21.867 } 00:18:21.867 ] 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.867 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.159 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.159 "name": "Existed_Raid", 00:18:22.159 "uuid": "1833d346-d932-4799-950a-819f3461c70d", 00:18:22.159 "strip_size_kb": 64, 00:18:22.159 "state": "configuring", 00:18:22.159 "raid_level": "concat", 00:18:22.159 "superblock": true, 00:18:22.159 "num_base_bdevs": 4, 00:18:22.159 "num_base_bdevs_discovered": 1, 00:18:22.159 "num_base_bdevs_operational": 4, 00:18:22.159 "base_bdevs_list": [ 00:18:22.159 { 00:18:22.159 "name": "BaseBdev1", 00:18:22.159 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:22.159 "is_configured": true, 00:18:22.159 "data_offset": 2048, 00:18:22.159 "data_size": 63488 00:18:22.159 }, 00:18:22.159 { 00:18:22.159 "name": "BaseBdev2", 00:18:22.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.159 "is_configured": false, 00:18:22.159 "data_offset": 0, 00:18:22.159 "data_size": 0 00:18:22.159 }, 00:18:22.159 { 00:18:22.159 "name": "BaseBdev3", 00:18:22.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.159 "is_configured": false, 00:18:22.160 "data_offset": 0, 00:18:22.160 "data_size": 0 00:18:22.160 }, 00:18:22.160 { 00:18:22.160 "name": "BaseBdev4", 00:18:22.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.160 "is_configured": false, 00:18:22.160 "data_offset": 0, 00:18:22.160 "data_size": 0 00:18:22.160 } 00:18:22.160 ] 00:18:22.160 }' 00:18:22.160 07:54:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.160 07:54:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:18:23.115 07:54:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:23.375 [2024-07-15 07:54:08.068962] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:23.375 [2024-07-15 07:54:08.068988] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d6f60 name Existed_Raid, state configuring 00:18:23.375 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:23.944 [2024-07-15 07:54:08.610342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:23.944 [2024-07-15 07:54:08.611465] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:23.944 [2024-07-15 07:54:08.611488] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:23.944 [2024-07-15 07:54:08.611494] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:23.944 [2024-07-15 07:54:08.611500] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:23.944 [2024-07-15 07:54:08.611505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:23.944 [2024-07-15 07:54:08.611511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.944 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.204 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.204 "name": 
"Existed_Raid", 00:18:24.204 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:24.204 "strip_size_kb": 64, 00:18:24.204 "state": "configuring", 00:18:24.204 "raid_level": "concat", 00:18:24.204 "superblock": true, 00:18:24.204 "num_base_bdevs": 4, 00:18:24.204 "num_base_bdevs_discovered": 1, 00:18:24.204 "num_base_bdevs_operational": 4, 00:18:24.204 "base_bdevs_list": [ 00:18:24.204 { 00:18:24.204 "name": "BaseBdev1", 00:18:24.204 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:24.204 "is_configured": true, 00:18:24.204 "data_offset": 2048, 00:18:24.204 "data_size": 63488 00:18:24.204 }, 00:18:24.204 { 00:18:24.204 "name": "BaseBdev2", 00:18:24.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.204 "is_configured": false, 00:18:24.204 "data_offset": 0, 00:18:24.204 "data_size": 0 00:18:24.204 }, 00:18:24.204 { 00:18:24.204 "name": "BaseBdev3", 00:18:24.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.204 "is_configured": false, 00:18:24.204 "data_offset": 0, 00:18:24.204 "data_size": 0 00:18:24.204 }, 00:18:24.204 { 00:18:24.204 "name": "BaseBdev4", 00:18:24.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.204 "is_configured": false, 00:18:24.204 "data_offset": 0, 00:18:24.204 "data_size": 0 00:18:24.204 } 00:18:24.204 ] 00:18:24.204 }' 00:18:24.204 07:54:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.205 07:54:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:24.774 [2024-07-15 07:54:09.493561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:24.774 BaseBdev2 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:24.774 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:25.034 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:25.293 [ 00:18:25.293 { 00:18:25.293 "name": "BaseBdev2", 00:18:25.293 "aliases": [ 00:18:25.293 "847fb1ed-f301-43ab-b827-ed9f32a12de8" 00:18:25.293 ], 00:18:25.293 "product_name": "Malloc disk", 00:18:25.293 "block_size": 512, 00:18:25.293 "num_blocks": 65536, 00:18:25.293 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:25.293 "assigned_rate_limits": { 00:18:25.293 "rw_ios_per_sec": 0, 00:18:25.293 "rw_mbytes_per_sec": 0, 00:18:25.293 "r_mbytes_per_sec": 0, 00:18:25.293 "w_mbytes_per_sec": 0 00:18:25.293 }, 00:18:25.293 "claimed": true, 
00:18:25.293 "claim_type": "exclusive_write", 00:18:25.293 "zoned": false, 00:18:25.293 "supported_io_types": { 00:18:25.293 "read": true, 00:18:25.293 "write": true, 00:18:25.293 "unmap": true, 00:18:25.293 "flush": true, 00:18:25.293 "reset": true, 00:18:25.293 "nvme_admin": false, 00:18:25.293 "nvme_io": false, 00:18:25.293 "nvme_io_md": false, 00:18:25.293 "write_zeroes": true, 00:18:25.293 "zcopy": true, 00:18:25.293 "get_zone_info": false, 00:18:25.293 "zone_management": false, 00:18:25.293 "zone_append": false, 00:18:25.293 "compare": false, 00:18:25.293 "compare_and_write": false, 00:18:25.293 "abort": true, 00:18:25.293 "seek_hole": false, 00:18:25.293 "seek_data": false, 00:18:25.293 "copy": true, 00:18:25.293 "nvme_iov_md": false 00:18:25.293 }, 00:18:25.293 "memory_domains": [ 00:18:25.293 { 00:18:25.293 "dma_device_id": "system", 00:18:25.293 "dma_device_type": 1 00:18:25.293 }, 00:18:25.293 { 00:18:25.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.293 "dma_device_type": 2 00:18:25.293 } 00:18:25.293 ], 00:18:25.293 "driver_specific": {} 00:18:25.293 } 00:18:25.293 ] 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.293 07:54:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.553 07:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.553 "name": "Existed_Raid", 00:18:25.553 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:25.553 "strip_size_kb": 64, 00:18:25.553 "state": "configuring", 00:18:25.553 "raid_level": "concat", 00:18:25.553 "superblock": true, 00:18:25.553 "num_base_bdevs": 4, 00:18:25.553 "num_base_bdevs_discovered": 2, 00:18:25.553 "num_base_bdevs_operational": 4, 00:18:25.553 "base_bdevs_list": [ 00:18:25.553 { 00:18:25.553 "name": "BaseBdev1", 00:18:25.553 "uuid": 
"a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:25.553 "is_configured": true, 00:18:25.553 "data_offset": 2048, 00:18:25.553 "data_size": 63488 00:18:25.553 }, 00:18:25.553 { 00:18:25.553 "name": "BaseBdev2", 00:18:25.553 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:25.553 "is_configured": true, 00:18:25.553 "data_offset": 2048, 00:18:25.553 "data_size": 63488 00:18:25.553 }, 00:18:25.553 { 00:18:25.553 "name": "BaseBdev3", 00:18:25.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.553 "is_configured": false, 00:18:25.553 "data_offset": 0, 00:18:25.553 "data_size": 0 00:18:25.553 }, 00:18:25.553 { 00:18:25.553 "name": "BaseBdev4", 00:18:25.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.553 "is_configured": false, 00:18:25.553 "data_offset": 0, 00:18:25.553 "data_size": 0 00:18:25.553 } 00:18:25.553 ] 00:18:25.553 }' 00:18:25.553 07:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.553 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:26.122 [2024-07-15 07:54:10.785692] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:26.122 BaseBdev3 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:26.122 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:26.381 07:54:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:26.642 [ 00:18:26.642 { 00:18:26.642 "name": "BaseBdev3", 00:18:26.642 "aliases": [ 00:18:26.642 "a39f0d0f-f305-485b-a6de-6c863a502fa5" 00:18:26.642 ], 00:18:26.642 "product_name": "Malloc disk", 00:18:26.642 "block_size": 512, 00:18:26.642 "num_blocks": 65536, 00:18:26.642 "uuid": "a39f0d0f-f305-485b-a6de-6c863a502fa5", 00:18:26.642 "assigned_rate_limits": { 00:18:26.642 "rw_ios_per_sec": 0, 00:18:26.642 "rw_mbytes_per_sec": 0, 00:18:26.642 "r_mbytes_per_sec": 0, 00:18:26.642 "w_mbytes_per_sec": 0 00:18:26.642 }, 00:18:26.642 "claimed": true, 00:18:26.642 "claim_type": "exclusive_write", 00:18:26.642 "zoned": false, 00:18:26.642 "supported_io_types": { 00:18:26.642 "read": true, 00:18:26.642 "write": true, 00:18:26.642 "unmap": true, 00:18:26.642 "flush": true, 00:18:26.642 "reset": true, 00:18:26.642 "nvme_admin": false, 00:18:26.642 "nvme_io": false, 00:18:26.642 "nvme_io_md": false, 00:18:26.642 "write_zeroes": true, 00:18:26.642 "zcopy": true, 00:18:26.642 "get_zone_info": 
false, 00:18:26.642 "zone_management": false, 00:18:26.642 "zone_append": false, 00:18:26.642 "compare": false, 00:18:26.642 "compare_and_write": false, 00:18:26.642 "abort": true, 00:18:26.642 "seek_hole": false, 00:18:26.642 "seek_data": false, 00:18:26.642 "copy": true, 00:18:26.642 "nvme_iov_md": false 00:18:26.642 }, 00:18:26.642 "memory_domains": [ 00:18:26.642 { 00:18:26.642 "dma_device_id": "system", 00:18:26.642 "dma_device_type": 1 00:18:26.642 }, 00:18:26.642 { 00:18:26.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.642 "dma_device_type": 2 00:18:26.642 } 00:18:26.642 ], 00:18:26.642 "driver_specific": {} 00:18:26.642 } 00:18:26.642 ] 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.642 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.642 "name": "Existed_Raid", 00:18:26.642 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:26.642 "strip_size_kb": 64, 00:18:26.642 "state": "configuring", 00:18:26.642 "raid_level": "concat", 00:18:26.642 "superblock": true, 00:18:26.642 "num_base_bdevs": 4, 00:18:26.642 "num_base_bdevs_discovered": 3, 00:18:26.642 "num_base_bdevs_operational": 4, 00:18:26.642 "base_bdevs_list": [ 00:18:26.642 { 00:18:26.642 "name": "BaseBdev1", 00:18:26.642 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:26.642 "is_configured": true, 00:18:26.642 "data_offset": 2048, 00:18:26.642 "data_size": 63488 00:18:26.642 }, 00:18:26.642 { 00:18:26.642 "name": "BaseBdev2", 00:18:26.642 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:26.642 "is_configured": true, 00:18:26.642 "data_offset": 2048, 00:18:26.642 "data_size": 63488 00:18:26.642 }, 00:18:26.642 { 00:18:26.642 "name": "BaseBdev3", 00:18:26.642 "uuid": 
"a39f0d0f-f305-485b-a6de-6c863a502fa5", 00:18:26.642 "is_configured": true, 00:18:26.642 "data_offset": 2048, 00:18:26.642 "data_size": 63488 00:18:26.642 }, 00:18:26.642 { 00:18:26.642 "name": "BaseBdev4", 00:18:26.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.643 "is_configured": false, 00:18:26.643 "data_offset": 0, 00:18:26.643 "data_size": 0 00:18:26.643 } 00:18:26.643 ] 00:18:26.643 }' 00:18:26.643 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.643 07:54:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.213 07:54:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:27.473 [2024-07-15 07:54:12.110091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:27.473 [2024-07-15 07:54:12.110215] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d7fc0 00:18:27.473 [2024-07-15 07:54:12.110222] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:27.473 [2024-07-15 07:54:12.110359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d7c00 00:18:27.473 [2024-07-15 07:54:12.110454] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d7fc0 00:18:27.473 [2024-07-15 07:54:12.110460] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21d7fc0 00:18:27.473 [2024-07-15 07:54:12.110526] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:27.473 BaseBdev4 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:27.473 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:27.732 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:27.992 [ 00:18:27.992 { 00:18:27.992 "name": "BaseBdev4", 00:18:27.992 "aliases": [ 00:18:27.992 "2686aac4-7e2c-46dc-95e7-a35bc58adbd5" 00:18:27.992 ], 00:18:27.992 "product_name": "Malloc disk", 00:18:27.992 "block_size": 512, 00:18:27.992 "num_blocks": 65536, 00:18:27.992 "uuid": "2686aac4-7e2c-46dc-95e7-a35bc58adbd5", 00:18:27.992 "assigned_rate_limits": { 00:18:27.992 "rw_ios_per_sec": 0, 00:18:27.992 "rw_mbytes_per_sec": 0, 00:18:27.992 "r_mbytes_per_sec": 0, 00:18:27.992 "w_mbytes_per_sec": 0 00:18:27.992 }, 00:18:27.992 "claimed": true, 00:18:27.992 "claim_type": "exclusive_write", 00:18:27.992 "zoned": false, 00:18:27.992 "supported_io_types": { 00:18:27.992 "read": 
true, 00:18:27.992 "write": true, 00:18:27.992 "unmap": true, 00:18:27.992 "flush": true, 00:18:27.992 "reset": true, 00:18:27.992 "nvme_admin": false, 00:18:27.992 "nvme_io": false, 00:18:27.992 "nvme_io_md": false, 00:18:27.992 "write_zeroes": true, 00:18:27.992 "zcopy": true, 00:18:27.992 "get_zone_info": false, 00:18:27.992 "zone_management": false, 00:18:27.992 "zone_append": false, 00:18:27.992 "compare": false, 00:18:27.992 "compare_and_write": false, 00:18:27.992 "abort": true, 00:18:27.992 "seek_hole": false, 00:18:27.992 "seek_data": false, 00:18:27.992 "copy": true, 00:18:27.992 "nvme_iov_md": false 00:18:27.992 }, 00:18:27.992 "memory_domains": [ 00:18:27.992 { 00:18:27.992 "dma_device_id": "system", 00:18:27.992 "dma_device_type": 1 00:18:27.992 }, 00:18:27.992 { 00:18:27.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.992 "dma_device_type": 2 00:18:27.992 } 00:18:27.992 ], 00:18:27.992 "driver_specific": {} 00:18:27.992 } 00:18:27.992 ] 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.992 "name": "Existed_Raid", 00:18:27.992 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:27.992 "strip_size_kb": 64, 00:18:27.992 "state": "online", 00:18:27.992 "raid_level": "concat", 00:18:27.992 "superblock": true, 00:18:27.992 "num_base_bdevs": 4, 00:18:27.992 "num_base_bdevs_discovered": 4, 00:18:27.992 "num_base_bdevs_operational": 4, 00:18:27.992 "base_bdevs_list": [ 00:18:27.992 { 00:18:27.992 "name": "BaseBdev1", 00:18:27.992 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:27.992 "is_configured": true, 00:18:27.992 "data_offset": 2048, 00:18:27.992 "data_size": 63488 00:18:27.992 }, 
00:18:27.992 { 00:18:27.992 "name": "BaseBdev2", 00:18:27.992 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:27.992 "is_configured": true, 00:18:27.992 "data_offset": 2048, 00:18:27.992 "data_size": 63488 00:18:27.992 }, 00:18:27.992 { 00:18:27.992 "name": "BaseBdev3", 00:18:27.992 "uuid": "a39f0d0f-f305-485b-a6de-6c863a502fa5", 00:18:27.992 "is_configured": true, 00:18:27.992 "data_offset": 2048, 00:18:27.992 "data_size": 63488 00:18:27.992 }, 00:18:27.992 { 00:18:27.992 "name": "BaseBdev4", 00:18:27.992 "uuid": "2686aac4-7e2c-46dc-95e7-a35bc58adbd5", 00:18:27.992 "is_configured": true, 00:18:27.992 "data_offset": 2048, 00:18:27.992 "data_size": 63488 00:18:27.992 } 00:18:27.992 ] 00:18:27.992 }' 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.992 07:54:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:28.562 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:28.822 [2024-07-15 07:54:13.425680] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:28.822 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:28.822 "name": "Existed_Raid", 00:18:28.822 "aliases": [ 00:18:28.822 "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f" 00:18:28.822 ], 00:18:28.822 "product_name": "Raid Volume", 00:18:28.822 "block_size": 512, 00:18:28.822 "num_blocks": 253952, 00:18:28.822 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:28.822 "assigned_rate_limits": { 00:18:28.822 "rw_ios_per_sec": 0, 00:18:28.822 "rw_mbytes_per_sec": 0, 00:18:28.822 "r_mbytes_per_sec": 0, 00:18:28.822 "w_mbytes_per_sec": 0 00:18:28.822 }, 00:18:28.822 "claimed": false, 00:18:28.822 "zoned": false, 00:18:28.822 "supported_io_types": { 00:18:28.822 "read": true, 00:18:28.822 "write": true, 00:18:28.822 "unmap": true, 00:18:28.822 "flush": true, 00:18:28.822 "reset": true, 00:18:28.822 "nvme_admin": false, 00:18:28.822 "nvme_io": false, 00:18:28.822 "nvme_io_md": false, 00:18:28.822 "write_zeroes": true, 00:18:28.822 "zcopy": false, 00:18:28.822 "get_zone_info": false, 00:18:28.822 "zone_management": false, 00:18:28.822 "zone_append": false, 00:18:28.822 "compare": false, 00:18:28.822 "compare_and_write": false, 00:18:28.822 "abort": false, 00:18:28.822 "seek_hole": false, 00:18:28.822 "seek_data": false, 00:18:28.822 "copy": false, 00:18:28.822 "nvme_iov_md": false 00:18:28.822 }, 00:18:28.822 "memory_domains": [ 00:18:28.822 { 00:18:28.822 "dma_device_id": "system", 00:18:28.822 "dma_device_type": 1 00:18:28.822 }, 
00:18:28.822 { 00:18:28.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.822 "dma_device_type": 2 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "dma_device_id": "system", 00:18:28.822 "dma_device_type": 1 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.822 "dma_device_type": 2 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "dma_device_id": "system", 00:18:28.822 "dma_device_type": 1 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.822 "dma_device_type": 2 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "dma_device_id": "system", 00:18:28.822 "dma_device_type": 1 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.822 "dma_device_type": 2 00:18:28.822 } 00:18:28.822 ], 00:18:28.822 "driver_specific": { 00:18:28.822 "raid": { 00:18:28.822 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:28.822 "strip_size_kb": 64, 00:18:28.822 "state": "online", 00:18:28.822 "raid_level": "concat", 00:18:28.822 "superblock": true, 00:18:28.822 "num_base_bdevs": 4, 00:18:28.822 "num_base_bdevs_discovered": 4, 00:18:28.822 "num_base_bdevs_operational": 4, 00:18:28.822 "base_bdevs_list": [ 00:18:28.822 { 00:18:28.822 "name": "BaseBdev1", 00:18:28.822 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:28.822 "is_configured": true, 00:18:28.822 "data_offset": 2048, 00:18:28.822 "data_size": 63488 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "name": "BaseBdev2", 00:18:28.822 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:28.822 "is_configured": true, 00:18:28.822 "data_offset": 2048, 00:18:28.822 "data_size": 63488 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "name": "BaseBdev3", 00:18:28.822 "uuid": "a39f0d0f-f305-485b-a6de-6c863a502fa5", 00:18:28.822 "is_configured": true, 00:18:28.822 "data_offset": 2048, 00:18:28.822 "data_size": 63488 00:18:28.822 }, 00:18:28.822 { 00:18:28.822 "name": "BaseBdev4", 00:18:28.822 "uuid": "2686aac4-7e2c-46dc-95e7-a35bc58adbd5", 00:18:28.822 "is_configured": true, 00:18:28.822 "data_offset": 2048, 00:18:28.822 "data_size": 63488 00:18:28.822 } 00:18:28.822 ] 00:18:28.822 } 00:18:28.822 } 00:18:28.822 }' 00:18:28.822 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:28.822 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:28.822 BaseBdev2 00:18:28.822 BaseBdev3 00:18:28.822 BaseBdev4' 00:18:28.822 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:28.822 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:28.822 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.082 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.082 "name": "BaseBdev1", 00:18:29.082 "aliases": [ 00:18:29.082 "a1edc924-697d-4092-b8b2-e44833dfc28b" 00:18:29.082 ], 00:18:29.082 "product_name": "Malloc disk", 00:18:29.082 "block_size": 512, 00:18:29.082 "num_blocks": 65536, 00:18:29.082 "uuid": "a1edc924-697d-4092-b8b2-e44833dfc28b", 00:18:29.082 "assigned_rate_limits": { 00:18:29.082 "rw_ios_per_sec": 0, 00:18:29.082 "rw_mbytes_per_sec": 0, 00:18:29.082 "r_mbytes_per_sec": 0, 00:18:29.082 
"w_mbytes_per_sec": 0 00:18:29.082 }, 00:18:29.082 "claimed": true, 00:18:29.082 "claim_type": "exclusive_write", 00:18:29.082 "zoned": false, 00:18:29.082 "supported_io_types": { 00:18:29.082 "read": true, 00:18:29.082 "write": true, 00:18:29.082 "unmap": true, 00:18:29.082 "flush": true, 00:18:29.082 "reset": true, 00:18:29.082 "nvme_admin": false, 00:18:29.082 "nvme_io": false, 00:18:29.082 "nvme_io_md": false, 00:18:29.082 "write_zeroes": true, 00:18:29.082 "zcopy": true, 00:18:29.082 "get_zone_info": false, 00:18:29.082 "zone_management": false, 00:18:29.082 "zone_append": false, 00:18:29.082 "compare": false, 00:18:29.082 "compare_and_write": false, 00:18:29.082 "abort": true, 00:18:29.082 "seek_hole": false, 00:18:29.082 "seek_data": false, 00:18:29.082 "copy": true, 00:18:29.082 "nvme_iov_md": false 00:18:29.082 }, 00:18:29.082 "memory_domains": [ 00:18:29.082 { 00:18:29.082 "dma_device_id": "system", 00:18:29.082 "dma_device_type": 1 00:18:29.082 }, 00:18:29.082 { 00:18:29.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.082 "dma_device_type": 2 00:18:29.082 } 00:18:29.082 ], 00:18:29.082 "driver_specific": {} 00:18:29.082 }' 00:18:29.082 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.082 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.082 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.082 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.082 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.341 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.341 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.341 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.341 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.341 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.341 07:54:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.341 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.341 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.341 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:29.341 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.600 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.600 "name": "BaseBdev2", 00:18:29.600 "aliases": [ 00:18:29.600 "847fb1ed-f301-43ab-b827-ed9f32a12de8" 00:18:29.600 ], 00:18:29.600 "product_name": "Malloc disk", 00:18:29.600 "block_size": 512, 00:18:29.600 "num_blocks": 65536, 00:18:29.600 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:29.600 "assigned_rate_limits": { 00:18:29.600 "rw_ios_per_sec": 0, 00:18:29.600 "rw_mbytes_per_sec": 0, 00:18:29.600 "r_mbytes_per_sec": 0, 00:18:29.600 "w_mbytes_per_sec": 0 00:18:29.600 }, 00:18:29.600 "claimed": true, 00:18:29.600 "claim_type": "exclusive_write", 00:18:29.600 
"zoned": false, 00:18:29.600 "supported_io_types": { 00:18:29.600 "read": true, 00:18:29.600 "write": true, 00:18:29.600 "unmap": true, 00:18:29.600 "flush": true, 00:18:29.600 "reset": true, 00:18:29.600 "nvme_admin": false, 00:18:29.600 "nvme_io": false, 00:18:29.600 "nvme_io_md": false, 00:18:29.600 "write_zeroes": true, 00:18:29.601 "zcopy": true, 00:18:29.601 "get_zone_info": false, 00:18:29.601 "zone_management": false, 00:18:29.601 "zone_append": false, 00:18:29.601 "compare": false, 00:18:29.601 "compare_and_write": false, 00:18:29.601 "abort": true, 00:18:29.601 "seek_hole": false, 00:18:29.601 "seek_data": false, 00:18:29.601 "copy": true, 00:18:29.601 "nvme_iov_md": false 00:18:29.601 }, 00:18:29.601 "memory_domains": [ 00:18:29.601 { 00:18:29.601 "dma_device_id": "system", 00:18:29.601 "dma_device_type": 1 00:18:29.601 }, 00:18:29.601 { 00:18:29.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.601 "dma_device_type": 2 00:18:29.601 } 00:18:29.601 ], 00:18:29.601 "driver_specific": {} 00:18:29.601 }' 00:18:29.601 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.601 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.601 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.601 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.601 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:29.860 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.120 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.120 "name": "BaseBdev3", 00:18:30.120 "aliases": [ 00:18:30.120 "a39f0d0f-f305-485b-a6de-6c863a502fa5" 00:18:30.120 ], 00:18:30.120 "product_name": "Malloc disk", 00:18:30.120 "block_size": 512, 00:18:30.120 "num_blocks": 65536, 00:18:30.120 "uuid": "a39f0d0f-f305-485b-a6de-6c863a502fa5", 00:18:30.120 "assigned_rate_limits": { 00:18:30.120 "rw_ios_per_sec": 0, 00:18:30.120 "rw_mbytes_per_sec": 0, 00:18:30.120 "r_mbytes_per_sec": 0, 00:18:30.120 "w_mbytes_per_sec": 0 00:18:30.120 }, 00:18:30.120 "claimed": true, 00:18:30.120 "claim_type": "exclusive_write", 00:18:30.120 "zoned": false, 00:18:30.120 "supported_io_types": { 00:18:30.120 "read": true, 00:18:30.120 "write": true, 00:18:30.120 "unmap": 
true, 00:18:30.120 "flush": true, 00:18:30.120 "reset": true, 00:18:30.120 "nvme_admin": false, 00:18:30.120 "nvme_io": false, 00:18:30.120 "nvme_io_md": false, 00:18:30.120 "write_zeroes": true, 00:18:30.120 "zcopy": true, 00:18:30.120 "get_zone_info": false, 00:18:30.120 "zone_management": false, 00:18:30.120 "zone_append": false, 00:18:30.120 "compare": false, 00:18:30.120 "compare_and_write": false, 00:18:30.120 "abort": true, 00:18:30.120 "seek_hole": false, 00:18:30.120 "seek_data": false, 00:18:30.120 "copy": true, 00:18:30.120 "nvme_iov_md": false 00:18:30.120 }, 00:18:30.120 "memory_domains": [ 00:18:30.121 { 00:18:30.121 "dma_device_id": "system", 00:18:30.121 "dma_device_type": 1 00:18:30.121 }, 00:18:30.121 { 00:18:30.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.121 "dma_device_type": 2 00:18:30.121 } 00:18:30.121 ], 00:18:30.121 "driver_specific": {} 00:18:30.121 }' 00:18:30.121 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.121 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.121 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.121 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.121 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.381 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.381 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.381 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.381 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.381 07:54:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.381 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.381 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.381 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.381 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:30.381 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.641 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.641 "name": "BaseBdev4", 00:18:30.641 "aliases": [ 00:18:30.641 "2686aac4-7e2c-46dc-95e7-a35bc58adbd5" 00:18:30.641 ], 00:18:30.641 "product_name": "Malloc disk", 00:18:30.641 "block_size": 512, 00:18:30.641 "num_blocks": 65536, 00:18:30.641 "uuid": "2686aac4-7e2c-46dc-95e7-a35bc58adbd5", 00:18:30.641 "assigned_rate_limits": { 00:18:30.641 "rw_ios_per_sec": 0, 00:18:30.641 "rw_mbytes_per_sec": 0, 00:18:30.641 "r_mbytes_per_sec": 0, 00:18:30.641 "w_mbytes_per_sec": 0 00:18:30.641 }, 00:18:30.641 "claimed": true, 00:18:30.641 "claim_type": "exclusive_write", 00:18:30.641 "zoned": false, 00:18:30.641 "supported_io_types": { 00:18:30.641 "read": true, 00:18:30.641 "write": true, 00:18:30.641 "unmap": true, 00:18:30.641 "flush": true, 00:18:30.641 "reset": true, 00:18:30.641 "nvme_admin": false, 00:18:30.641 "nvme_io": false, 
00:18:30.641 "nvme_io_md": false, 00:18:30.641 "write_zeroes": true, 00:18:30.641 "zcopy": true, 00:18:30.641 "get_zone_info": false, 00:18:30.641 "zone_management": false, 00:18:30.641 "zone_append": false, 00:18:30.641 "compare": false, 00:18:30.641 "compare_and_write": false, 00:18:30.641 "abort": true, 00:18:30.641 "seek_hole": false, 00:18:30.641 "seek_data": false, 00:18:30.641 "copy": true, 00:18:30.641 "nvme_iov_md": false 00:18:30.641 }, 00:18:30.641 "memory_domains": [ 00:18:30.641 { 00:18:30.641 "dma_device_id": "system", 00:18:30.641 "dma_device_type": 1 00:18:30.642 }, 00:18:30.642 { 00:18:30.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.642 "dma_device_type": 2 00:18:30.642 } 00:18:30.642 ], 00:18:30.642 "driver_specific": {} 00:18:30.642 }' 00:18:30.642 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.642 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.642 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.642 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.902 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:31.162 [2024-07-15 07:54:15.799465] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:31.162 [2024-07-15 07:54:15.799483] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:31.162 [2024-07-15 07:54:15.799515] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:31.162 07:54:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.162 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:31.163 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.163 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.163 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.163 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.163 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.163 07:54:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.423 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.423 "name": "Existed_Raid", 00:18:31.423 "uuid": "c1d8c18f-f093-4c17-b7c7-7f7182b6bc2f", 00:18:31.423 "strip_size_kb": 64, 00:18:31.423 "state": "offline", 00:18:31.423 "raid_level": "concat", 00:18:31.423 "superblock": true, 00:18:31.423 "num_base_bdevs": 4, 00:18:31.423 "num_base_bdevs_discovered": 3, 00:18:31.423 "num_base_bdevs_operational": 3, 00:18:31.423 "base_bdevs_list": [ 00:18:31.423 { 00:18:31.423 "name": null, 00:18:31.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.423 "is_configured": false, 00:18:31.423 "data_offset": 2048, 00:18:31.423 "data_size": 63488 00:18:31.423 }, 00:18:31.423 { 00:18:31.423 "name": "BaseBdev2", 00:18:31.423 "uuid": "847fb1ed-f301-43ab-b827-ed9f32a12de8", 00:18:31.423 "is_configured": true, 00:18:31.423 "data_offset": 2048, 00:18:31.423 "data_size": 63488 00:18:31.423 }, 00:18:31.423 { 00:18:31.423 "name": "BaseBdev3", 00:18:31.423 "uuid": "a39f0d0f-f305-485b-a6de-6c863a502fa5", 00:18:31.423 "is_configured": true, 00:18:31.423 "data_offset": 2048, 00:18:31.423 "data_size": 63488 00:18:31.423 }, 00:18:31.423 { 00:18:31.423 "name": "BaseBdev4", 00:18:31.423 "uuid": "2686aac4-7e2c-46dc-95e7-a35bc58adbd5", 00:18:31.423 "is_configured": true, 00:18:31.423 "data_offset": 2048, 00:18:31.423 "data_size": 63488 00:18:31.423 } 00:18:31.423 ] 00:18:31.423 }' 00:18:31.423 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.423 07:54:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.993 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:31.993 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:31.993 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:31.993 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.254 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:32.254 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:32.254 07:54:16 
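The offline-state check traced above reduces to a single RPC round trip plus a jq filter. A minimal stand-alone sketch, built only from the socket path, rpc.py script, bdev name and JSON fields that already appear in the trace (the rpc shell variable is shorthand introduced here, not part of the test):

# Query every raid bdev over the test RPC socket and print Existed_Raid's state.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "Existed_Raid") | .state'
# At this point in the test the raid_bdev_info dump above reports "offline",
# since BaseBdev1 was just deleted and concat has no redundancy.
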
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:32.254 [2024-07-15 07:54:16.942360] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:32.254 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:32.254 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:32.254 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.254 07:54:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:32.515 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:32.515 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:32.515 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:32.775 [2024-07-15 07:54:17.329159] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:32.775 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:32.775 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:32.775 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.775 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:33.035 [2024-07-15 07:54:17.715871] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:33.035 [2024-07-15 07:54:17.715899] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d7fc0 name Existed_Raid, state offline 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:33.035 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.295 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:33.295 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:33.295 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:33.295 07:54:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:33.295 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:33.295 07:54:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:33.555 BaseBdev2 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:33.555 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:33.815 [ 00:18:33.815 { 00:18:33.815 "name": "BaseBdev2", 00:18:33.815 "aliases": [ 00:18:33.815 "e1b4afce-f675-4e16-a36e-92ef910fc44a" 00:18:33.815 ], 00:18:33.815 "product_name": "Malloc disk", 00:18:33.815 "block_size": 512, 00:18:33.815 "num_blocks": 65536, 00:18:33.815 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:33.815 "assigned_rate_limits": { 00:18:33.815 "rw_ios_per_sec": 0, 00:18:33.815 "rw_mbytes_per_sec": 0, 00:18:33.815 "r_mbytes_per_sec": 0, 00:18:33.815 "w_mbytes_per_sec": 0 00:18:33.815 }, 00:18:33.815 "claimed": false, 00:18:33.815 "zoned": false, 00:18:33.815 "supported_io_types": { 00:18:33.815 "read": true, 00:18:33.815 "write": true, 00:18:33.815 "unmap": true, 00:18:33.815 "flush": true, 00:18:33.815 "reset": true, 00:18:33.815 "nvme_admin": false, 00:18:33.815 "nvme_io": false, 00:18:33.815 "nvme_io_md": false, 00:18:33.815 "write_zeroes": true, 00:18:33.815 "zcopy": true, 00:18:33.815 "get_zone_info": false, 00:18:33.815 "zone_management": false, 00:18:33.815 "zone_append": false, 00:18:33.815 "compare": false, 00:18:33.815 "compare_and_write": false, 00:18:33.815 "abort": true, 00:18:33.815 "seek_hole": false, 00:18:33.815 "seek_data": false, 00:18:33.815 "copy": true, 00:18:33.815 "nvme_iov_md": false 00:18:33.815 }, 00:18:33.815 "memory_domains": [ 00:18:33.815 { 00:18:33.815 "dma_device_id": "system", 00:18:33.815 "dma_device_type": 1 00:18:33.815 }, 00:18:33.815 { 00:18:33.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.815 "dma_device_type": 2 00:18:33.815 } 00:18:33.815 ], 00:18:33.815 "driver_specific": {} 00:18:33.815 } 00:18:33.815 ] 00:18:33.815 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:33.815 07:54:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:33.815 07:54:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:33.815 07:54:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:34.076 BaseBdev3 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:34.076 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.336 07:54:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:34.336 [ 00:18:34.336 { 00:18:34.336 "name": "BaseBdev3", 00:18:34.336 "aliases": [ 00:18:34.336 "0d12f6b3-aaae-4691-b185-33b33e2d31fc" 00:18:34.336 ], 00:18:34.336 "product_name": "Malloc disk", 00:18:34.336 "block_size": 512, 00:18:34.336 "num_blocks": 65536, 00:18:34.336 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:34.336 "assigned_rate_limits": { 00:18:34.336 "rw_ios_per_sec": 0, 00:18:34.336 "rw_mbytes_per_sec": 0, 00:18:34.336 "r_mbytes_per_sec": 0, 00:18:34.336 "w_mbytes_per_sec": 0 00:18:34.336 }, 00:18:34.336 "claimed": false, 00:18:34.336 "zoned": false, 00:18:34.336 "supported_io_types": { 00:18:34.336 "read": true, 00:18:34.336 "write": true, 00:18:34.336 "unmap": true, 00:18:34.336 "flush": true, 00:18:34.336 "reset": true, 00:18:34.336 "nvme_admin": false, 00:18:34.336 "nvme_io": false, 00:18:34.336 "nvme_io_md": false, 00:18:34.336 "write_zeroes": true, 00:18:34.336 "zcopy": true, 00:18:34.336 "get_zone_info": false, 00:18:34.336 "zone_management": false, 00:18:34.336 "zone_append": false, 00:18:34.336 "compare": false, 00:18:34.336 "compare_and_write": false, 00:18:34.336 "abort": true, 00:18:34.336 "seek_hole": false, 00:18:34.336 "seek_data": false, 00:18:34.336 "copy": true, 00:18:34.336 "nvme_iov_md": false 00:18:34.336 }, 00:18:34.336 "memory_domains": [ 00:18:34.336 { 00:18:34.336 "dma_device_id": "system", 00:18:34.336 "dma_device_type": 1 00:18:34.336 }, 00:18:34.336 { 00:18:34.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.336 "dma_device_type": 2 00:18:34.336 } 00:18:34.336 ], 00:18:34.336 "driver_specific": {} 00:18:34.336 } 00:18:34.336 ] 00:18:34.336 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:34.336 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:34.336 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:34.336 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:34.597 BaseBdev4 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:34.597 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.857 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:35.117 [ 00:18:35.117 { 00:18:35.117 "name": "BaseBdev4", 00:18:35.117 "aliases": [ 00:18:35.117 "d3cf3fdd-026b-4fa4-b23e-9142edf87990" 00:18:35.117 ], 00:18:35.117 "product_name": "Malloc disk", 00:18:35.117 "block_size": 512, 00:18:35.117 "num_blocks": 65536, 00:18:35.117 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:35.117 "assigned_rate_limits": { 00:18:35.117 "rw_ios_per_sec": 0, 00:18:35.117 "rw_mbytes_per_sec": 0, 00:18:35.117 "r_mbytes_per_sec": 0, 00:18:35.117 "w_mbytes_per_sec": 0 00:18:35.117 }, 00:18:35.117 "claimed": false, 00:18:35.117 "zoned": false, 00:18:35.117 "supported_io_types": { 00:18:35.117 "read": true, 00:18:35.117 "write": true, 00:18:35.117 "unmap": true, 00:18:35.117 "flush": true, 00:18:35.117 "reset": true, 00:18:35.117 "nvme_admin": false, 00:18:35.117 "nvme_io": false, 00:18:35.117 "nvme_io_md": false, 00:18:35.117 "write_zeroes": true, 00:18:35.117 "zcopy": true, 00:18:35.117 "get_zone_info": false, 00:18:35.117 "zone_management": false, 00:18:35.117 "zone_append": false, 00:18:35.117 "compare": false, 00:18:35.117 "compare_and_write": false, 00:18:35.117 "abort": true, 00:18:35.117 "seek_hole": false, 00:18:35.117 "seek_data": false, 00:18:35.117 "copy": true, 00:18:35.117 "nvme_iov_md": false 00:18:35.117 }, 00:18:35.117 "memory_domains": [ 00:18:35.117 { 00:18:35.117 "dma_device_id": "system", 00:18:35.117 "dma_device_type": 1 00:18:35.117 }, 00:18:35.117 { 00:18:35.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.117 "dma_device_type": 2 00:18:35.117 } 00:18:35.117 ], 00:18:35.117 "driver_specific": {} 00:18:35.117 } 00:18:35.117 ] 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:35.117 [2024-07-15 07:54:19.807202] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:35.117 [2024-07-15 07:54:19.807230] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:35.117 [2024-07-15 07:54:19.807243] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:35.117 [2024-07-15 07:54:19.808307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:35.117 [2024-07-15 07:54:19.808338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.117 07:54:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:35.378 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.378 "name": "Existed_Raid", 00:18:35.378 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:35.378 "strip_size_kb": 64, 00:18:35.378 "state": "configuring", 00:18:35.378 "raid_level": "concat", 00:18:35.378 "superblock": true, 00:18:35.378 "num_base_bdevs": 4, 00:18:35.378 "num_base_bdevs_discovered": 3, 00:18:35.378 "num_base_bdevs_operational": 4, 00:18:35.378 "base_bdevs_list": [ 00:18:35.378 { 00:18:35.378 "name": "BaseBdev1", 00:18:35.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.378 "is_configured": false, 00:18:35.378 "data_offset": 0, 00:18:35.378 "data_size": 0 00:18:35.378 }, 00:18:35.378 { 00:18:35.378 "name": "BaseBdev2", 00:18:35.378 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:35.378 "is_configured": true, 00:18:35.378 "data_offset": 2048, 00:18:35.378 "data_size": 63488 00:18:35.378 }, 00:18:35.378 { 00:18:35.378 "name": "BaseBdev3", 00:18:35.378 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:35.378 "is_configured": true, 00:18:35.378 "data_offset": 2048, 00:18:35.378 "data_size": 63488 00:18:35.378 }, 00:18:35.378 { 00:18:35.378 "name": "BaseBdev4", 00:18:35.378 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:35.378 "is_configured": true, 00:18:35.378 "data_offset": 2048, 00:18:35.378 "data_size": 63488 00:18:35.378 } 00:18:35.378 ] 00:18:35.378 }' 00:18:35.378 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.378 07:54:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:35.949 07:54:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:36.210 [2024-07-15 07:54:20.761600] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.210 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.489 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.489 "name": "Existed_Raid", 00:18:36.489 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:36.489 "strip_size_kb": 64, 00:18:36.489 "state": "configuring", 00:18:36.489 "raid_level": "concat", 00:18:36.489 "superblock": true, 00:18:36.489 "num_base_bdevs": 4, 00:18:36.489 "num_base_bdevs_discovered": 2, 00:18:36.489 "num_base_bdevs_operational": 4, 00:18:36.489 "base_bdevs_list": [ 00:18:36.489 { 00:18:36.489 "name": "BaseBdev1", 00:18:36.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:36.489 "is_configured": false, 00:18:36.489 "data_offset": 0, 00:18:36.489 "data_size": 0 00:18:36.489 }, 00:18:36.489 { 00:18:36.489 "name": null, 00:18:36.489 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:36.489 "is_configured": false, 00:18:36.489 "data_offset": 2048, 00:18:36.489 "data_size": 63488 00:18:36.489 }, 00:18:36.489 { 00:18:36.489 "name": "BaseBdev3", 00:18:36.489 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:36.489 "is_configured": true, 00:18:36.489 "data_offset": 2048, 00:18:36.489 "data_size": 63488 00:18:36.489 }, 00:18:36.489 { 00:18:36.489 "name": "BaseBdev4", 00:18:36.489 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:36.489 "is_configured": true, 00:18:36.489 "data_offset": 2048, 00:18:36.489 "data_size": 63488 00:18:36.489 } 00:18:36.489 ] 00:18:36.489 }' 00:18:36.489 07:54:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.489 07:54:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:36.793 07:54:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.793 07:54:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:37.053 07:54:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:37.053 07:54:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:37.313 [2024-07-15 07:54:21.909498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:37.313 BaseBdev1 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:37.313 07:54:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:37.573 07:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:37.573 [ 00:18:37.573 { 00:18:37.573 "name": "BaseBdev1", 00:18:37.573 "aliases": [ 00:18:37.573 "9508a0a3-c838-48b7-b06f-2ea92a1493ec" 00:18:37.573 ], 00:18:37.573 "product_name": "Malloc disk", 00:18:37.573 "block_size": 512, 00:18:37.573 "num_blocks": 65536, 00:18:37.573 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:37.573 "assigned_rate_limits": { 00:18:37.573 "rw_ios_per_sec": 0, 00:18:37.573 "rw_mbytes_per_sec": 0, 00:18:37.573 "r_mbytes_per_sec": 0, 00:18:37.573 "w_mbytes_per_sec": 0 00:18:37.573 }, 00:18:37.573 "claimed": true, 00:18:37.573 "claim_type": "exclusive_write", 00:18:37.573 "zoned": false, 00:18:37.573 "supported_io_types": { 00:18:37.573 "read": true, 00:18:37.573 "write": true, 00:18:37.573 "unmap": true, 00:18:37.573 "flush": true, 00:18:37.574 "reset": true, 00:18:37.574 "nvme_admin": false, 00:18:37.574 "nvme_io": false, 00:18:37.574 "nvme_io_md": false, 00:18:37.574 "write_zeroes": true, 00:18:37.574 "zcopy": true, 00:18:37.574 "get_zone_info": false, 00:18:37.574 "zone_management": false, 00:18:37.574 "zone_append": false, 00:18:37.574 "compare": false, 00:18:37.574 "compare_and_write": false, 00:18:37.574 "abort": true, 00:18:37.574 "seek_hole": false, 00:18:37.574 "seek_data": false, 00:18:37.574 "copy": true, 00:18:37.574 "nvme_iov_md": false 00:18:37.574 }, 00:18:37.574 "memory_domains": [ 00:18:37.574 { 00:18:37.574 "dma_device_id": "system", 00:18:37.574 "dma_device_type": 1 00:18:37.574 }, 00:18:37.574 { 00:18:37.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.574 "dma_device_type": 2 00:18:37.574 } 00:18:37.574 ], 00:18:37.574 "driver_specific": {} 00:18:37.574 } 00:18:37.574 ] 00:18:37.574 
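The re-creation of BaseBdev1 traced above follows the same create-and-wait pattern used for the other base bdevs. A sketch of that sequence, assembled from commands that appear verbatim in the trace (sizes match the 65536 x 512-byte blocks in the dump; the rpc variable is shorthand introduced here):

# Create a 32 MiB malloc bdev (65536 blocks of 512 bytes), let examine callbacks
# finish, then poll for the bdev with a 2000 ms timeout, as waitforbdev does above.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
$rpc -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
$rpc -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
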
07:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.574 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:37.835 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.835 "name": "Existed_Raid", 00:18:37.835 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:37.835 "strip_size_kb": 64, 00:18:37.835 "state": "configuring", 00:18:37.835 "raid_level": "concat", 00:18:37.835 "superblock": true, 00:18:37.835 "num_base_bdevs": 4, 00:18:37.835 "num_base_bdevs_discovered": 3, 00:18:37.835 "num_base_bdevs_operational": 4, 00:18:37.835 "base_bdevs_list": [ 00:18:37.835 { 00:18:37.835 "name": "BaseBdev1", 00:18:37.835 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:37.835 "is_configured": true, 00:18:37.835 "data_offset": 2048, 00:18:37.835 "data_size": 63488 00:18:37.835 }, 00:18:37.835 { 00:18:37.835 "name": null, 00:18:37.835 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:37.835 "is_configured": false, 00:18:37.835 "data_offset": 2048, 00:18:37.835 "data_size": 63488 00:18:37.835 }, 00:18:37.835 { 00:18:37.835 "name": "BaseBdev3", 00:18:37.835 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:37.835 "is_configured": true, 00:18:37.835 "data_offset": 2048, 00:18:37.835 "data_size": 63488 00:18:37.835 }, 00:18:37.835 { 00:18:37.835 "name": "BaseBdev4", 00:18:37.835 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:37.835 "is_configured": true, 00:18:37.835 "data_offset": 2048, 00:18:37.835 "data_size": 63488 00:18:37.835 } 00:18:37.835 ] 00:18:37.835 }' 00:18:37.835 07:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.835 07:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:38.405 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.405 07:54:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:38.665 [2024-07-15 07:54:23.393269] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.665 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:38.924 07:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.924 "name": "Existed_Raid", 00:18:38.924 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:38.924 "strip_size_kb": 64, 00:18:38.924 "state": "configuring", 00:18:38.924 "raid_level": "concat", 00:18:38.924 "superblock": true, 00:18:38.924 "num_base_bdevs": 4, 00:18:38.924 "num_base_bdevs_discovered": 2, 00:18:38.924 "num_base_bdevs_operational": 4, 00:18:38.924 "base_bdevs_list": [ 00:18:38.924 { 00:18:38.924 "name": "BaseBdev1", 00:18:38.924 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:38.924 "is_configured": true, 00:18:38.924 "data_offset": 2048, 00:18:38.924 "data_size": 63488 00:18:38.924 }, 00:18:38.924 { 00:18:38.924 "name": null, 00:18:38.924 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:38.924 "is_configured": false, 00:18:38.924 "data_offset": 2048, 00:18:38.924 "data_size": 63488 00:18:38.924 }, 00:18:38.924 { 00:18:38.924 "name": null, 00:18:38.924 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:38.924 "is_configured": false, 00:18:38.924 "data_offset": 2048, 00:18:38.924 "data_size": 63488 00:18:38.924 }, 00:18:38.924 { 00:18:38.924 "name": "BaseBdev4", 00:18:38.924 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:38.924 "is_configured": true, 00:18:38.924 "data_offset": 2048, 00:18:38.924 "data_size": 63488 00:18:38.924 } 00:18:38.924 ] 00:18:38.924 }' 00:18:38.924 07:54:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.924 07:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:39.491 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.491 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:39.750 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:39.750 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:40.009 [2024-07-15 07:54:24.564280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.009 "name": "Existed_Raid", 00:18:40.009 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:40.009 "strip_size_kb": 64, 00:18:40.009 "state": "configuring", 00:18:40.009 "raid_level": "concat", 00:18:40.009 "superblock": true, 00:18:40.009 "num_base_bdevs": 4, 00:18:40.009 "num_base_bdevs_discovered": 3, 00:18:40.009 "num_base_bdevs_operational": 4, 00:18:40.009 "base_bdevs_list": [ 00:18:40.009 { 00:18:40.009 "name": "BaseBdev1", 00:18:40.009 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:40.009 "is_configured": true, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 }, 00:18:40.009 { 00:18:40.009 "name": null, 00:18:40.009 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:40.009 "is_configured": false, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 }, 00:18:40.009 { 00:18:40.009 "name": "BaseBdev3", 00:18:40.009 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 
00:18:40.009 "is_configured": true, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 }, 00:18:40.009 { 00:18:40.009 "name": "BaseBdev4", 00:18:40.009 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:40.009 "is_configured": true, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 } 00:18:40.009 ] 00:18:40.009 }' 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.009 07:54:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:40.578 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.578 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:40.837 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:40.837 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:41.096 [2024-07-15 07:54:25.671086] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.096 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.356 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.356 "name": "Existed_Raid", 00:18:41.356 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:41.356 "strip_size_kb": 64, 00:18:41.356 "state": "configuring", 00:18:41.356 "raid_level": "concat", 00:18:41.356 "superblock": true, 00:18:41.356 "num_base_bdevs": 4, 00:18:41.356 "num_base_bdevs_discovered": 2, 00:18:41.356 "num_base_bdevs_operational": 4, 00:18:41.356 "base_bdevs_list": [ 00:18:41.356 { 00:18:41.356 "name": null, 00:18:41.356 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:41.356 "is_configured": false, 00:18:41.356 "data_offset": 
2048, 00:18:41.356 "data_size": 63488 00:18:41.356 }, 00:18:41.356 { 00:18:41.356 "name": null, 00:18:41.356 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:41.356 "is_configured": false, 00:18:41.356 "data_offset": 2048, 00:18:41.356 "data_size": 63488 00:18:41.356 }, 00:18:41.356 { 00:18:41.356 "name": "BaseBdev3", 00:18:41.356 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:41.356 "is_configured": true, 00:18:41.356 "data_offset": 2048, 00:18:41.356 "data_size": 63488 00:18:41.356 }, 00:18:41.356 { 00:18:41.356 "name": "BaseBdev4", 00:18:41.356 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:41.356 "is_configured": true, 00:18:41.356 "data_offset": 2048, 00:18:41.356 "data_size": 63488 00:18:41.356 } 00:18:41.356 ] 00:18:41.356 }' 00:18:41.356 07:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.356 07:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.925 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.925 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:41.925 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:41.925 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:42.185 [2024-07-15 07:54:26.823816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.185 07:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.445 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.445 "name": "Existed_Raid", 00:18:42.445 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:42.445 "strip_size_kb": 64, 
00:18:42.445 "state": "configuring", 00:18:42.445 "raid_level": "concat", 00:18:42.445 "superblock": true, 00:18:42.445 "num_base_bdevs": 4, 00:18:42.445 "num_base_bdevs_discovered": 3, 00:18:42.445 "num_base_bdevs_operational": 4, 00:18:42.445 "base_bdevs_list": [ 00:18:42.445 { 00:18:42.445 "name": null, 00:18:42.445 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:42.445 "is_configured": false, 00:18:42.445 "data_offset": 2048, 00:18:42.445 "data_size": 63488 00:18:42.445 }, 00:18:42.445 { 00:18:42.445 "name": "BaseBdev2", 00:18:42.445 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:42.445 "is_configured": true, 00:18:42.445 "data_offset": 2048, 00:18:42.445 "data_size": 63488 00:18:42.445 }, 00:18:42.445 { 00:18:42.445 "name": "BaseBdev3", 00:18:42.445 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:42.445 "is_configured": true, 00:18:42.445 "data_offset": 2048, 00:18:42.445 "data_size": 63488 00:18:42.445 }, 00:18:42.445 { 00:18:42.445 "name": "BaseBdev4", 00:18:42.445 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:42.445 "is_configured": true, 00:18:42.445 "data_offset": 2048, 00:18:42.445 "data_size": 63488 00:18:42.445 } 00:18:42.445 ] 00:18:42.445 }' 00:18:42.445 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.445 07:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:43.015 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.015 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:43.015 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:43.015 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.015 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:43.274 07:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9508a0a3-c838-48b7-b06f-2ea92a1493ec 00:18:43.534 [2024-07-15 07:54:28.119988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:43.534 [2024-07-15 07:54:28.120105] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21da080 00:18:43.534 [2024-07-15 07:54:28.120113] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:43.534 [2024-07-15 07:54:28.120249] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21cfc30 00:18:43.534 [2024-07-15 07:54:28.120337] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21da080 00:18:43.534 [2024-07-15 07:54:28.120342] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21da080 00:18:43.534 [2024-07-15 07:54:28.120406] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:43.534 NewBaseBdev 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:43.534 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:43.795 [ 00:18:43.795 { 00:18:43.795 "name": "NewBaseBdev", 00:18:43.795 "aliases": [ 00:18:43.795 "9508a0a3-c838-48b7-b06f-2ea92a1493ec" 00:18:43.795 ], 00:18:43.795 "product_name": "Malloc disk", 00:18:43.795 "block_size": 512, 00:18:43.795 "num_blocks": 65536, 00:18:43.795 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:43.795 "assigned_rate_limits": { 00:18:43.795 "rw_ios_per_sec": 0, 00:18:43.795 "rw_mbytes_per_sec": 0, 00:18:43.795 "r_mbytes_per_sec": 0, 00:18:43.795 "w_mbytes_per_sec": 0 00:18:43.795 }, 00:18:43.795 "claimed": true, 00:18:43.795 "claim_type": "exclusive_write", 00:18:43.795 "zoned": false, 00:18:43.795 "supported_io_types": { 00:18:43.795 "read": true, 00:18:43.795 "write": true, 00:18:43.795 "unmap": true, 00:18:43.795 "flush": true, 00:18:43.795 "reset": true, 00:18:43.795 "nvme_admin": false, 00:18:43.795 "nvme_io": false, 00:18:43.795 "nvme_io_md": false, 00:18:43.795 "write_zeroes": true, 00:18:43.795 "zcopy": true, 00:18:43.795 "get_zone_info": false, 00:18:43.795 "zone_management": false, 00:18:43.795 "zone_append": false, 00:18:43.795 "compare": false, 00:18:43.795 "compare_and_write": false, 00:18:43.795 "abort": true, 00:18:43.795 "seek_hole": false, 00:18:43.795 "seek_data": false, 00:18:43.795 "copy": true, 00:18:43.795 "nvme_iov_md": false 00:18:43.795 }, 00:18:43.795 "memory_domains": [ 00:18:43.795 { 00:18:43.795 "dma_device_id": "system", 00:18:43.795 "dma_device_type": 1 00:18:43.795 }, 00:18:43.795 { 00:18:43.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.795 "dma_device_type": 2 00:18:43.795 } 00:18:43.795 ], 00:18:43.795 "driver_specific": {} 00:18:43.795 } 00:18:43.795 ] 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.795 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.055 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.055 "name": "Existed_Raid", 00:18:44.055 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:44.055 "strip_size_kb": 64, 00:18:44.055 "state": "online", 00:18:44.055 "raid_level": "concat", 00:18:44.055 "superblock": true, 00:18:44.055 "num_base_bdevs": 4, 00:18:44.055 "num_base_bdevs_discovered": 4, 00:18:44.055 "num_base_bdevs_operational": 4, 00:18:44.055 "base_bdevs_list": [ 00:18:44.055 { 00:18:44.055 "name": "NewBaseBdev", 00:18:44.055 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:44.055 "is_configured": true, 00:18:44.055 "data_offset": 2048, 00:18:44.055 "data_size": 63488 00:18:44.055 }, 00:18:44.055 { 00:18:44.055 "name": "BaseBdev2", 00:18:44.055 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:44.055 "is_configured": true, 00:18:44.055 "data_offset": 2048, 00:18:44.055 "data_size": 63488 00:18:44.055 }, 00:18:44.055 { 00:18:44.055 "name": "BaseBdev3", 00:18:44.055 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:44.055 "is_configured": true, 00:18:44.055 "data_offset": 2048, 00:18:44.055 "data_size": 63488 00:18:44.055 }, 00:18:44.055 { 00:18:44.055 "name": "BaseBdev4", 00:18:44.055 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:44.055 "is_configured": true, 00:18:44.055 "data_offset": 2048, 00:18:44.055 "data_size": 63488 00:18:44.055 } 00:18:44.055 ] 00:18:44.055 }' 00:18:44.055 07:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.055 07:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:44.624 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:44.624 [2024-07-15 07:54:29.367385] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:44.884 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:18:44.884 "name": "Existed_Raid", 00:18:44.884 "aliases": [ 00:18:44.884 "8a361efc-d387-4de0-bb4f-75ed20fe6231" 00:18:44.884 ], 00:18:44.884 "product_name": "Raid Volume", 00:18:44.884 "block_size": 512, 00:18:44.884 "num_blocks": 253952, 00:18:44.884 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:44.884 "assigned_rate_limits": { 00:18:44.884 "rw_ios_per_sec": 0, 00:18:44.884 "rw_mbytes_per_sec": 0, 00:18:44.884 "r_mbytes_per_sec": 0, 00:18:44.884 "w_mbytes_per_sec": 0 00:18:44.884 }, 00:18:44.884 "claimed": false, 00:18:44.884 "zoned": false, 00:18:44.884 "supported_io_types": { 00:18:44.884 "read": true, 00:18:44.884 "write": true, 00:18:44.884 "unmap": true, 00:18:44.884 "flush": true, 00:18:44.884 "reset": true, 00:18:44.884 "nvme_admin": false, 00:18:44.884 "nvme_io": false, 00:18:44.884 "nvme_io_md": false, 00:18:44.884 "write_zeroes": true, 00:18:44.884 "zcopy": false, 00:18:44.884 "get_zone_info": false, 00:18:44.884 "zone_management": false, 00:18:44.884 "zone_append": false, 00:18:44.884 "compare": false, 00:18:44.884 "compare_and_write": false, 00:18:44.884 "abort": false, 00:18:44.884 "seek_hole": false, 00:18:44.884 "seek_data": false, 00:18:44.884 "copy": false, 00:18:44.884 "nvme_iov_md": false 00:18:44.884 }, 00:18:44.884 "memory_domains": [ 00:18:44.884 { 00:18:44.884 "dma_device_id": "system", 00:18:44.884 "dma_device_type": 1 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.884 "dma_device_type": 2 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "system", 00:18:44.884 "dma_device_type": 1 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.884 "dma_device_type": 2 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "system", 00:18:44.884 "dma_device_type": 1 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.884 "dma_device_type": 2 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "system", 00:18:44.884 "dma_device_type": 1 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.884 "dma_device_type": 2 00:18:44.884 } 00:18:44.884 ], 00:18:44.884 "driver_specific": { 00:18:44.884 "raid": { 00:18:44.884 "uuid": "8a361efc-d387-4de0-bb4f-75ed20fe6231", 00:18:44.884 "strip_size_kb": 64, 00:18:44.884 "state": "online", 00:18:44.884 "raid_level": "concat", 00:18:44.884 "superblock": true, 00:18:44.884 "num_base_bdevs": 4, 00:18:44.884 "num_base_bdevs_discovered": 4, 00:18:44.884 "num_base_bdevs_operational": 4, 00:18:44.884 "base_bdevs_list": [ 00:18:44.884 { 00:18:44.884 "name": "NewBaseBdev", 00:18:44.884 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:44.884 "is_configured": true, 00:18:44.884 "data_offset": 2048, 00:18:44.884 "data_size": 63488 00:18:44.884 }, 00:18:44.884 { 00:18:44.884 "name": "BaseBdev2", 00:18:44.884 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:44.884 "is_configured": true, 00:18:44.885 "data_offset": 2048, 00:18:44.885 "data_size": 63488 00:18:44.885 }, 00:18:44.885 { 00:18:44.885 "name": "BaseBdev3", 00:18:44.885 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:44.885 "is_configured": true, 00:18:44.885 "data_offset": 2048, 00:18:44.885 "data_size": 63488 00:18:44.885 }, 00:18:44.885 { 00:18:44.885 "name": "BaseBdev4", 00:18:44.885 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:44.885 "is_configured": true, 00:18:44.885 "data_offset": 2048, 00:18:44.885 "data_size": 63488 00:18:44.885 } 
00:18:44.885 ] 00:18:44.885 } 00:18:44.885 } 00:18:44.885 }' 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:44.885 BaseBdev2 00:18:44.885 BaseBdev3 00:18:44.885 BaseBdev4' 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:44.885 "name": "NewBaseBdev", 00:18:44.885 "aliases": [ 00:18:44.885 "9508a0a3-c838-48b7-b06f-2ea92a1493ec" 00:18:44.885 ], 00:18:44.885 "product_name": "Malloc disk", 00:18:44.885 "block_size": 512, 00:18:44.885 "num_blocks": 65536, 00:18:44.885 "uuid": "9508a0a3-c838-48b7-b06f-2ea92a1493ec", 00:18:44.885 "assigned_rate_limits": { 00:18:44.885 "rw_ios_per_sec": 0, 00:18:44.885 "rw_mbytes_per_sec": 0, 00:18:44.885 "r_mbytes_per_sec": 0, 00:18:44.885 "w_mbytes_per_sec": 0 00:18:44.885 }, 00:18:44.885 "claimed": true, 00:18:44.885 "claim_type": "exclusive_write", 00:18:44.885 "zoned": false, 00:18:44.885 "supported_io_types": { 00:18:44.885 "read": true, 00:18:44.885 "write": true, 00:18:44.885 "unmap": true, 00:18:44.885 "flush": true, 00:18:44.885 "reset": true, 00:18:44.885 "nvme_admin": false, 00:18:44.885 "nvme_io": false, 00:18:44.885 "nvme_io_md": false, 00:18:44.885 "write_zeroes": true, 00:18:44.885 "zcopy": true, 00:18:44.885 "get_zone_info": false, 00:18:44.885 "zone_management": false, 00:18:44.885 "zone_append": false, 00:18:44.885 "compare": false, 00:18:44.885 "compare_and_write": false, 00:18:44.885 "abort": true, 00:18:44.885 "seek_hole": false, 00:18:44.885 "seek_data": false, 00:18:44.885 "copy": true, 00:18:44.885 "nvme_iov_md": false 00:18:44.885 }, 00:18:44.885 "memory_domains": [ 00:18:44.885 { 00:18:44.885 "dma_device_id": "system", 00:18:44.885 "dma_device_type": 1 00:18:44.885 }, 00:18:44.885 { 00:18:44.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.885 "dma_device_type": 2 00:18:44.885 } 00:18:44.885 ], 00:18:44.885 "driver_specific": {} 00:18:44.885 }' 00:18:44.885 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.144 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.144 
07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.403 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.403 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.403 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.403 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:45.403 07:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.403 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:45.403 "name": "BaseBdev2", 00:18:45.403 "aliases": [ 00:18:45.403 "e1b4afce-f675-4e16-a36e-92ef910fc44a" 00:18:45.403 ], 00:18:45.403 "product_name": "Malloc disk", 00:18:45.403 "block_size": 512, 00:18:45.403 "num_blocks": 65536, 00:18:45.403 "uuid": "e1b4afce-f675-4e16-a36e-92ef910fc44a", 00:18:45.403 "assigned_rate_limits": { 00:18:45.403 "rw_ios_per_sec": 0, 00:18:45.403 "rw_mbytes_per_sec": 0, 00:18:45.403 "r_mbytes_per_sec": 0, 00:18:45.403 "w_mbytes_per_sec": 0 00:18:45.403 }, 00:18:45.403 "claimed": true, 00:18:45.403 "claim_type": "exclusive_write", 00:18:45.403 "zoned": false, 00:18:45.403 "supported_io_types": { 00:18:45.403 "read": true, 00:18:45.403 "write": true, 00:18:45.403 "unmap": true, 00:18:45.403 "flush": true, 00:18:45.403 "reset": true, 00:18:45.403 "nvme_admin": false, 00:18:45.403 "nvme_io": false, 00:18:45.403 "nvme_io_md": false, 00:18:45.403 "write_zeroes": true, 00:18:45.403 "zcopy": true, 00:18:45.403 "get_zone_info": false, 00:18:45.403 "zone_management": false, 00:18:45.403 "zone_append": false, 00:18:45.403 "compare": false, 00:18:45.403 "compare_and_write": false, 00:18:45.403 "abort": true, 00:18:45.403 "seek_hole": false, 00:18:45.403 "seek_data": false, 00:18:45.403 "copy": true, 00:18:45.403 "nvme_iov_md": false 00:18:45.403 }, 00:18:45.403 "memory_domains": [ 00:18:45.403 { 00:18:45.403 "dma_device_id": "system", 00:18:45.403 "dma_device_type": 1 00:18:45.403 }, 00:18:45.403 { 00:18:45.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.403 "dma_device_type": 2 00:18:45.403 } 00:18:45.403 ], 00:18:45.403 "driver_specific": {} 00:18:45.403 }' 00:18:45.403 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.662 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.921 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.921 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.921 07:54:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.921 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.921 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.921 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:45.921 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.181 "name": "BaseBdev3", 00:18:46.181 "aliases": [ 00:18:46.181 "0d12f6b3-aaae-4691-b185-33b33e2d31fc" 00:18:46.181 ], 00:18:46.181 "product_name": "Malloc disk", 00:18:46.181 "block_size": 512, 00:18:46.181 "num_blocks": 65536, 00:18:46.181 "uuid": "0d12f6b3-aaae-4691-b185-33b33e2d31fc", 00:18:46.181 "assigned_rate_limits": { 00:18:46.181 "rw_ios_per_sec": 0, 00:18:46.181 "rw_mbytes_per_sec": 0, 00:18:46.181 "r_mbytes_per_sec": 0, 00:18:46.181 "w_mbytes_per_sec": 0 00:18:46.181 }, 00:18:46.181 "claimed": true, 00:18:46.181 "claim_type": "exclusive_write", 00:18:46.181 "zoned": false, 00:18:46.181 "supported_io_types": { 00:18:46.181 "read": true, 00:18:46.181 "write": true, 00:18:46.181 "unmap": true, 00:18:46.181 "flush": true, 00:18:46.181 "reset": true, 00:18:46.181 "nvme_admin": false, 00:18:46.181 "nvme_io": false, 00:18:46.181 "nvme_io_md": false, 00:18:46.181 "write_zeroes": true, 00:18:46.181 "zcopy": true, 00:18:46.181 "get_zone_info": false, 00:18:46.181 "zone_management": false, 00:18:46.181 "zone_append": false, 00:18:46.181 "compare": false, 00:18:46.181 "compare_and_write": false, 00:18:46.181 "abort": true, 00:18:46.181 "seek_hole": false, 00:18:46.181 "seek_data": false, 00:18:46.181 "copy": true, 00:18:46.181 "nvme_iov_md": false 00:18:46.181 }, 00:18:46.181 "memory_domains": [ 00:18:46.181 { 00:18:46.181 "dma_device_id": "system", 00:18:46.181 "dma_device_type": 1 00:18:46.181 }, 00:18:46.181 { 00:18:46.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.181 "dma_device_type": 2 00:18:46.181 } 00:18:46.181 ], 00:18:46.181 "driver_specific": {} 00:18:46.181 }' 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.181 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.441 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.441 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.441 07:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.441 07:54:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.441 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.441 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:46.441 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.700 "name": "BaseBdev4", 00:18:46.700 "aliases": [ 00:18:46.700 "d3cf3fdd-026b-4fa4-b23e-9142edf87990" 00:18:46.700 ], 00:18:46.700 "product_name": "Malloc disk", 00:18:46.700 "block_size": 512, 00:18:46.700 "num_blocks": 65536, 00:18:46.700 "uuid": "d3cf3fdd-026b-4fa4-b23e-9142edf87990", 00:18:46.700 "assigned_rate_limits": { 00:18:46.700 "rw_ios_per_sec": 0, 00:18:46.700 "rw_mbytes_per_sec": 0, 00:18:46.700 "r_mbytes_per_sec": 0, 00:18:46.700 "w_mbytes_per_sec": 0 00:18:46.700 }, 00:18:46.700 "claimed": true, 00:18:46.700 "claim_type": "exclusive_write", 00:18:46.700 "zoned": false, 00:18:46.700 "supported_io_types": { 00:18:46.700 "read": true, 00:18:46.700 "write": true, 00:18:46.700 "unmap": true, 00:18:46.700 "flush": true, 00:18:46.700 "reset": true, 00:18:46.700 "nvme_admin": false, 00:18:46.700 "nvme_io": false, 00:18:46.700 "nvme_io_md": false, 00:18:46.700 "write_zeroes": true, 00:18:46.700 "zcopy": true, 00:18:46.700 "get_zone_info": false, 00:18:46.700 "zone_management": false, 00:18:46.700 "zone_append": false, 00:18:46.700 "compare": false, 00:18:46.700 "compare_and_write": false, 00:18:46.700 "abort": true, 00:18:46.700 "seek_hole": false, 00:18:46.700 "seek_data": false, 00:18:46.700 "copy": true, 00:18:46.700 "nvme_iov_md": false 00:18:46.700 }, 00:18:46.700 "memory_domains": [ 00:18:46.700 { 00:18:46.700 "dma_device_id": "system", 00:18:46.700 "dma_device_type": 1 00:18:46.700 }, 00:18:46.700 { 00:18:46.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.700 "dma_device_type": 2 00:18:46.700 } 00:18:46.700 ], 00:18:46.700 "driver_specific": {} 00:18:46.700 }' 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.700 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.960 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.960 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.960 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.960 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.960 07:54:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:47.220 [2024-07-15 07:54:31.749157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:47.220 [2024-07-15 07:54:31.749173] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:47.220 [2024-07-15 07:54:31.749209] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:47.220 [2024-07-15 07:54:31.749251] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:47.220 [2024-07-15 07:54:31.749257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21da080 name Existed_Raid, state offline 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1675897 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1675897 ']' 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1675897 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1675897 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1675897' 00:18:47.220 killing process with pid 1675897 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1675897 00:18:47.220 [2024-07-15 07:54:31.820539] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1675897 00:18:47.220 [2024-07-15 07:54:31.840843] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:47.220 00:18:47.220 real 0m28.387s 00:18:47.220 user 0m53.736s 00:18:47.220 sys 0m4.152s 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:47.220 07:54:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:47.220 ************************************ 00:18:47.220 END TEST raid_state_function_test_sb 00:18:47.220 ************************************ 00:18:47.481 07:54:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:47.481 07:54:31 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:18:47.481 07:54:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:47.481 07:54:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:47.481 07:54:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:47.481 ************************************ 00:18:47.481 START TEST raid_superblock_test 00:18:47.481 
************************************ 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1681807 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1681807 /var/tmp/spdk-raid.sock 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1681807 ']' 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:47.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:47.481 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.481 [2024-07-15 07:54:32.089897] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
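For orientation, the raid_superblock_test setup that the surrounding trace drives over the RPC socket can be condensed into the sketch below. The bdev_svc binary, the /var/tmp/spdk-raid.sock socket, the rpc.py subcommands, flags and the jq filter are taken from the trace itself; the loop, the shortened paths and the backgrounding with & are illustrative simplifications rather than the exact test code.

  # start the SPDK bdev_svc app with bdev_raid debug logging on a private RPC socket
  spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid &
  rpc="spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # create four 32 MB / 512-byte-block malloc bdevs and wrap each in a passthru bdev pt1..pt4
  for i in 1 2 3 4; do
    $rpc bdev_malloc_create 32 512 -b malloc$i
    $rpc bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  # assemble a concat raid (64 KB strip size) over the four passthru bdevs, with a superblock (-s)
  $rpc bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
  # confirm raid_bdev1 reports state "online" with all four base bdevs configured
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'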
00:18:47.481 [2024-07-15 07:54:32.089949] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1681807 ] 00:18:47.481 [2024-07-15 07:54:32.181818] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:47.741 [2024-07-15 07:54:32.258808] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:47.741 [2024-07-15 07:54:32.299758] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:47.741 [2024-07-15 07:54:32.299784] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:48.310 07:54:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:48.572 malloc1 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:48.572 [2024-07-15 07:54:33.294452] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:48.572 [2024-07-15 07:54:33.294488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:48.572 [2024-07-15 07:54:33.294499] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1447a20 00:18:48.572 [2024-07-15 07:54:33.294506] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:48.572 [2024-07-15 07:54:33.295784] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:48.572 [2024-07-15 07:54:33.295802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:48.572 pt1 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:48.572 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:48.833 malloc2 00:18:48.833 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:49.093 [2024-07-15 07:54:33.665237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:49.093 [2024-07-15 07:54:33.665263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.093 [2024-07-15 07:54:33.665273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1448040 00:18:49.093 [2024-07-15 07:54:33.665279] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.093 [2024-07-15 07:54:33.666412] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.093 [2024-07-15 07:54:33.666430] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:49.093 pt2 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:49.093 07:54:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:49.662 malloc3 00:18:49.662 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:49.662 [2024-07-15 07:54:34.404819] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:49.662 [2024-07-15 07:54:34.404850] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.662 [2024-07-15 07:54:34.404858] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1448540 00:18:49.662 [2024-07-15 07:54:34.404864] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.662 [2024-07-15 07:54:34.406015] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.662 [2024-07-15 07:54:34.406033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:49.662 pt3 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:49.922 malloc4 00:18:49.922 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:50.182 [2024-07-15 07:54:34.775404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:50.182 [2024-07-15 07:54:34.775427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.182 [2024-07-15 07:54:34.775436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f5d60 00:18:50.182 [2024-07-15 07:54:34.775442] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.182 [2024-07-15 07:54:34.776569] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.182 [2024-07-15 07:54:34.776586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:50.182 pt4 00:18:50.182 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:50.182 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:50.182 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:50.443 [2024-07-15 07:54:34.959890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:50.443 [2024-07-15 07:54:34.960851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:50.443 [2024-07-15 07:54:34.960890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:50.443 [2024-07-15 07:54:34.960923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:50.443 [2024-07-15 07:54:34.961054] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15f2e20 00:18:50.443 [2024-07-15 07:54:34.961061] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:50.443 [2024-07-15 07:54:34.961203] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1449000 00:18:50.443 [2024-07-15 07:54:34.961311] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15f2e20 00:18:50.443 [2024-07-15 07:54:34.961316] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15f2e20 00:18:50.443 [2024-07-15 07:54:34.961383] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.443 07:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:50.443 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.443 "name": "raid_bdev1", 00:18:50.443 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:50.443 "strip_size_kb": 64, 00:18:50.443 "state": "online", 00:18:50.443 "raid_level": "concat", 00:18:50.443 "superblock": true, 00:18:50.443 "num_base_bdevs": 4, 00:18:50.443 "num_base_bdevs_discovered": 4, 00:18:50.443 "num_base_bdevs_operational": 4, 00:18:50.443 "base_bdevs_list": [ 00:18:50.443 { 00:18:50.443 "name": "pt1", 00:18:50.443 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:50.443 "is_configured": true, 00:18:50.443 "data_offset": 2048, 00:18:50.443 "data_size": 63488 00:18:50.443 }, 00:18:50.443 { 00:18:50.443 "name": "pt2", 00:18:50.443 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:50.443 "is_configured": true, 00:18:50.443 "data_offset": 2048, 00:18:50.443 "data_size": 63488 00:18:50.443 }, 00:18:50.443 { 00:18:50.443 "name": "pt3", 00:18:50.443 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:50.443 "is_configured": true, 00:18:50.443 "data_offset": 2048, 00:18:50.443 "data_size": 63488 00:18:50.443 }, 00:18:50.443 { 00:18:50.443 "name": "pt4", 00:18:50.443 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:50.443 "is_configured": true, 00:18:50.443 "data_offset": 2048, 00:18:50.443 "data_size": 63488 00:18:50.443 } 00:18:50.443 ] 00:18:50.443 }' 00:18:50.443 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.443 07:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:51.057 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:51.317 [2024-07-15 07:54:35.902504] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:51.317 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:51.317 "name": "raid_bdev1", 00:18:51.317 "aliases": [ 00:18:51.317 "d99ce06e-5f81-41e5-9412-94d6bdd8eca1" 00:18:51.317 ], 00:18:51.317 "product_name": "Raid Volume", 00:18:51.317 "block_size": 512, 00:18:51.317 "num_blocks": 253952, 00:18:51.317 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:51.317 "assigned_rate_limits": { 00:18:51.317 "rw_ios_per_sec": 0, 00:18:51.317 "rw_mbytes_per_sec": 0, 00:18:51.317 "r_mbytes_per_sec": 0, 00:18:51.317 "w_mbytes_per_sec": 0 00:18:51.317 }, 00:18:51.317 "claimed": false, 00:18:51.317 "zoned": false, 00:18:51.317 "supported_io_types": { 00:18:51.317 "read": true, 00:18:51.317 "write": true, 00:18:51.317 "unmap": true, 00:18:51.317 "flush": true, 00:18:51.317 "reset": true, 00:18:51.317 "nvme_admin": false, 00:18:51.317 "nvme_io": false, 00:18:51.317 "nvme_io_md": false, 00:18:51.317 "write_zeroes": true, 00:18:51.317 "zcopy": false, 00:18:51.317 "get_zone_info": false, 00:18:51.317 "zone_management": false, 00:18:51.317 "zone_append": false, 00:18:51.317 "compare": false, 00:18:51.317 "compare_and_write": false, 00:18:51.317 "abort": false, 00:18:51.317 "seek_hole": false, 00:18:51.317 "seek_data": false, 00:18:51.317 "copy": false, 00:18:51.317 "nvme_iov_md": false 00:18:51.317 }, 00:18:51.317 "memory_domains": [ 00:18:51.317 { 00:18:51.317 "dma_device_id": "system", 00:18:51.317 "dma_device_type": 1 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.317 "dma_device_type": 2 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "system", 00:18:51.317 "dma_device_type": 1 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.317 "dma_device_type": 2 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "system", 00:18:51.317 "dma_device_type": 1 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.317 "dma_device_type": 2 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "system", 00:18:51.317 "dma_device_type": 1 00:18:51.317 }, 00:18:51.317 { 00:18:51.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.317 "dma_device_type": 2 00:18:51.317 } 00:18:51.317 ], 00:18:51.317 "driver_specific": { 00:18:51.317 "raid": { 00:18:51.317 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:51.317 "strip_size_kb": 64, 00:18:51.317 "state": "online", 00:18:51.317 "raid_level": "concat", 00:18:51.318 "superblock": 
true, 00:18:51.318 "num_base_bdevs": 4, 00:18:51.318 "num_base_bdevs_discovered": 4, 00:18:51.318 "num_base_bdevs_operational": 4, 00:18:51.318 "base_bdevs_list": [ 00:18:51.318 { 00:18:51.318 "name": "pt1", 00:18:51.318 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:51.318 "is_configured": true, 00:18:51.318 "data_offset": 2048, 00:18:51.318 "data_size": 63488 00:18:51.318 }, 00:18:51.318 { 00:18:51.318 "name": "pt2", 00:18:51.318 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:51.318 "is_configured": true, 00:18:51.318 "data_offset": 2048, 00:18:51.318 "data_size": 63488 00:18:51.318 }, 00:18:51.318 { 00:18:51.318 "name": "pt3", 00:18:51.318 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:51.318 "is_configured": true, 00:18:51.318 "data_offset": 2048, 00:18:51.318 "data_size": 63488 00:18:51.318 }, 00:18:51.318 { 00:18:51.318 "name": "pt4", 00:18:51.318 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:51.318 "is_configured": true, 00:18:51.318 "data_offset": 2048, 00:18:51.318 "data_size": 63488 00:18:51.318 } 00:18:51.318 ] 00:18:51.318 } 00:18:51.318 } 00:18:51.318 }' 00:18:51.318 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:51.318 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:51.318 pt2 00:18:51.318 pt3 00:18:51.318 pt4' 00:18:51.318 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.318 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:51.318 07:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:51.578 "name": "pt1", 00:18:51.578 "aliases": [ 00:18:51.578 "00000000-0000-0000-0000-000000000001" 00:18:51.578 ], 00:18:51.578 "product_name": "passthru", 00:18:51.578 "block_size": 512, 00:18:51.578 "num_blocks": 65536, 00:18:51.578 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:51.578 "assigned_rate_limits": { 00:18:51.578 "rw_ios_per_sec": 0, 00:18:51.578 "rw_mbytes_per_sec": 0, 00:18:51.578 "r_mbytes_per_sec": 0, 00:18:51.578 "w_mbytes_per_sec": 0 00:18:51.578 }, 00:18:51.578 "claimed": true, 00:18:51.578 "claim_type": "exclusive_write", 00:18:51.578 "zoned": false, 00:18:51.578 "supported_io_types": { 00:18:51.578 "read": true, 00:18:51.578 "write": true, 00:18:51.578 "unmap": true, 00:18:51.578 "flush": true, 00:18:51.578 "reset": true, 00:18:51.578 "nvme_admin": false, 00:18:51.578 "nvme_io": false, 00:18:51.578 "nvme_io_md": false, 00:18:51.578 "write_zeroes": true, 00:18:51.578 "zcopy": true, 00:18:51.578 "get_zone_info": false, 00:18:51.578 "zone_management": false, 00:18:51.578 "zone_append": false, 00:18:51.578 "compare": false, 00:18:51.578 "compare_and_write": false, 00:18:51.578 "abort": true, 00:18:51.578 "seek_hole": false, 00:18:51.578 "seek_data": false, 00:18:51.578 "copy": true, 00:18:51.578 "nvme_iov_md": false 00:18:51.578 }, 00:18:51.578 "memory_domains": [ 00:18:51.578 { 00:18:51.578 "dma_device_id": "system", 00:18:51.578 "dma_device_type": 1 00:18:51.578 }, 00:18:51.578 { 00:18:51.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:51.578 "dma_device_type": 2 00:18:51.578 } 00:18:51.578 ], 00:18:51.578 "driver_specific": { 00:18:51.578 "passthru": 
{ 00:18:51.578 "name": "pt1", 00:18:51.578 "base_bdev_name": "malloc1" 00:18:51.578 } 00:18:51.578 } 00:18:51.578 }' 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:51.578 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:51.839 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.101 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.101 "name": "pt2", 00:18:52.101 "aliases": [ 00:18:52.101 "00000000-0000-0000-0000-000000000002" 00:18:52.101 ], 00:18:52.101 "product_name": "passthru", 00:18:52.101 "block_size": 512, 00:18:52.101 "num_blocks": 65536, 00:18:52.101 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:52.101 "assigned_rate_limits": { 00:18:52.101 "rw_ios_per_sec": 0, 00:18:52.101 "rw_mbytes_per_sec": 0, 00:18:52.101 "r_mbytes_per_sec": 0, 00:18:52.101 "w_mbytes_per_sec": 0 00:18:52.101 }, 00:18:52.101 "claimed": true, 00:18:52.101 "claim_type": "exclusive_write", 00:18:52.101 "zoned": false, 00:18:52.101 "supported_io_types": { 00:18:52.101 "read": true, 00:18:52.101 "write": true, 00:18:52.101 "unmap": true, 00:18:52.101 "flush": true, 00:18:52.101 "reset": true, 00:18:52.101 "nvme_admin": false, 00:18:52.101 "nvme_io": false, 00:18:52.101 "nvme_io_md": false, 00:18:52.101 "write_zeroes": true, 00:18:52.101 "zcopy": true, 00:18:52.101 "get_zone_info": false, 00:18:52.101 "zone_management": false, 00:18:52.101 "zone_append": false, 00:18:52.101 "compare": false, 00:18:52.101 "compare_and_write": false, 00:18:52.101 "abort": true, 00:18:52.101 "seek_hole": false, 00:18:52.101 "seek_data": false, 00:18:52.101 "copy": true, 00:18:52.101 "nvme_iov_md": false 00:18:52.101 }, 00:18:52.101 "memory_domains": [ 00:18:52.101 { 00:18:52.101 "dma_device_id": "system", 00:18:52.101 "dma_device_type": 1 00:18:52.101 }, 00:18:52.101 { 00:18:52.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.101 "dma_device_type": 2 00:18:52.101 } 00:18:52.101 ], 00:18:52.101 "driver_specific": { 00:18:52.101 "passthru": { 00:18:52.101 "name": "pt2", 00:18:52.101 "base_bdev_name": "malloc2" 00:18:52.101 } 00:18:52.101 } 00:18:52.101 }' 00:18:52.101 
07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.101 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.101 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.101 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.101 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.362 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.362 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.362 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.362 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.362 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.362 07:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.362 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.362 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.362 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.362 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:52.622 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.622 "name": "pt3", 00:18:52.622 "aliases": [ 00:18:52.622 "00000000-0000-0000-0000-000000000003" 00:18:52.622 ], 00:18:52.622 "product_name": "passthru", 00:18:52.622 "block_size": 512, 00:18:52.622 "num_blocks": 65536, 00:18:52.622 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:52.622 "assigned_rate_limits": { 00:18:52.622 "rw_ios_per_sec": 0, 00:18:52.622 "rw_mbytes_per_sec": 0, 00:18:52.622 "r_mbytes_per_sec": 0, 00:18:52.622 "w_mbytes_per_sec": 0 00:18:52.622 }, 00:18:52.622 "claimed": true, 00:18:52.622 "claim_type": "exclusive_write", 00:18:52.622 "zoned": false, 00:18:52.622 "supported_io_types": { 00:18:52.622 "read": true, 00:18:52.622 "write": true, 00:18:52.622 "unmap": true, 00:18:52.622 "flush": true, 00:18:52.622 "reset": true, 00:18:52.622 "nvme_admin": false, 00:18:52.622 "nvme_io": false, 00:18:52.622 "nvme_io_md": false, 00:18:52.622 "write_zeroes": true, 00:18:52.622 "zcopy": true, 00:18:52.622 "get_zone_info": false, 00:18:52.622 "zone_management": false, 00:18:52.622 "zone_append": false, 00:18:52.622 "compare": false, 00:18:52.622 "compare_and_write": false, 00:18:52.622 "abort": true, 00:18:52.622 "seek_hole": false, 00:18:52.622 "seek_data": false, 00:18:52.622 "copy": true, 00:18:52.622 "nvme_iov_md": false 00:18:52.622 }, 00:18:52.622 "memory_domains": [ 00:18:52.622 { 00:18:52.622 "dma_device_id": "system", 00:18:52.622 "dma_device_type": 1 00:18:52.622 }, 00:18:52.622 { 00:18:52.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.622 "dma_device_type": 2 00:18:52.622 } 00:18:52.622 ], 00:18:52.622 "driver_specific": { 00:18:52.622 "passthru": { 00:18:52.622 "name": "pt3", 00:18:52.622 "base_bdev_name": "malloc3" 00:18:52.622 } 00:18:52.622 } 00:18:52.622 }' 00:18:52.622 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.622 07:54:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.622 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.622 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.622 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:52.882 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.142 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.142 "name": "pt4", 00:18:53.142 "aliases": [ 00:18:53.142 "00000000-0000-0000-0000-000000000004" 00:18:53.142 ], 00:18:53.142 "product_name": "passthru", 00:18:53.142 "block_size": 512, 00:18:53.142 "num_blocks": 65536, 00:18:53.142 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:53.142 "assigned_rate_limits": { 00:18:53.142 "rw_ios_per_sec": 0, 00:18:53.142 "rw_mbytes_per_sec": 0, 00:18:53.142 "r_mbytes_per_sec": 0, 00:18:53.142 "w_mbytes_per_sec": 0 00:18:53.142 }, 00:18:53.142 "claimed": true, 00:18:53.142 "claim_type": "exclusive_write", 00:18:53.142 "zoned": false, 00:18:53.142 "supported_io_types": { 00:18:53.142 "read": true, 00:18:53.142 "write": true, 00:18:53.142 "unmap": true, 00:18:53.142 "flush": true, 00:18:53.142 "reset": true, 00:18:53.142 "nvme_admin": false, 00:18:53.142 "nvme_io": false, 00:18:53.142 "nvme_io_md": false, 00:18:53.142 "write_zeroes": true, 00:18:53.142 "zcopy": true, 00:18:53.142 "get_zone_info": false, 00:18:53.142 "zone_management": false, 00:18:53.142 "zone_append": false, 00:18:53.142 "compare": false, 00:18:53.142 "compare_and_write": false, 00:18:53.142 "abort": true, 00:18:53.142 "seek_hole": false, 00:18:53.142 "seek_data": false, 00:18:53.142 "copy": true, 00:18:53.142 "nvme_iov_md": false 00:18:53.142 }, 00:18:53.142 "memory_domains": [ 00:18:53.142 { 00:18:53.142 "dma_device_id": "system", 00:18:53.142 "dma_device_type": 1 00:18:53.142 }, 00:18:53.142 { 00:18:53.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.142 "dma_device_type": 2 00:18:53.142 } 00:18:53.142 ], 00:18:53.142 "driver_specific": { 00:18:53.142 "passthru": { 00:18:53.142 "name": "pt4", 00:18:53.142 "base_bdev_name": "malloc4" 00:18:53.142 } 00:18:53.142 } 00:18:53.142 }' 00:18:53.142 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.142 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.142 07:54:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.142 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.402 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.402 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.402 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.402 07:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.402 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.402 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.402 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.402 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.402 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:53.402 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:53.663 [2024-07-15 07:54:38.288538] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:53.663 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d99ce06e-5f81-41e5-9412-94d6bdd8eca1 00:18:53.663 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d99ce06e-5f81-41e5-9412-94d6bdd8eca1 ']' 00:18:53.663 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:53.924 [2024-07-15 07:54:38.484791] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:53.924 [2024-07-15 07:54:38.484802] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:53.924 [2024-07-15 07:54:38.484838] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:53.924 [2024-07-15 07:54:38.484883] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:53.924 [2024-07-15 07:54:38.484890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15f2e20 name raid_bdev1, state offline 00:18:53.924 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.924 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:54.183 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:54.183 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:54.183 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:54.183 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:54.183 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:54.183 07:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:54.444 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:54.444 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:54.704 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:54.704 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:54.704 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:54.704 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:54.965 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:55.224 [2024-07-15 07:54:39.788046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:55.224 [2024-07-15 07:54:39.789106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:55.224 [2024-07-15 07:54:39.789140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
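The NOT invocation above (its outcome continues just below) is the negative half of the superblock test: after deleting the passthru bdevs, re-creating raid_bdev1 directly on malloc1-malloc4 must be rejected with -17 (File exists), because those bdevs still carry the superblock of the previous raid bdev. A reduced stand-in for the NOT helper, matching the es bookkeeping visible in the trace; this is an approximation, not the autotest_common.sh implementation:

  # Run a command that is expected to fail; succeed only if it really failed.
  # Approximation of the NOT helper used above (assumption, not upstream code).
  NOT() {
    local es=0
    "$@" || es=$?
    # a zero exit status here would mean the negative test itself failed
    (( es != 0 ))
  }

  # Usage matching the trace: the create must be refused with "File exists" (-17).
  NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
    bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
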
00:18:55.224 [2024-07-15 07:54:39.789165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:55.224 [2024-07-15 07:54:39.789197] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:55.224 [2024-07-15 07:54:39.789222] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:55.224 [2024-07-15 07:54:39.789236] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:55.224 [2024-07-15 07:54:39.789248] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:55.224 [2024-07-15 07:54:39.789258] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:55.224 [2024-07-15 07:54:39.789264] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15f9350 name raid_bdev1, state configuring 00:18:55.224 request: 00:18:55.224 { 00:18:55.224 "name": "raid_bdev1", 00:18:55.224 "raid_level": "concat", 00:18:55.224 "base_bdevs": [ 00:18:55.224 "malloc1", 00:18:55.224 "malloc2", 00:18:55.224 "malloc3", 00:18:55.224 "malloc4" 00:18:55.224 ], 00:18:55.224 "strip_size_kb": 64, 00:18:55.224 "superblock": false, 00:18:55.224 "method": "bdev_raid_create", 00:18:55.224 "req_id": 1 00:18:55.224 } 00:18:55.224 Got JSON-RPC error response 00:18:55.224 response: 00:18:55.224 { 00:18:55.224 "code": -17, 00:18:55.224 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:55.224 } 00:18:55.224 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:55.224 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:55.224 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:55.224 07:54:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:55.224 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.224 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:55.483 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:55.483 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:55.483 07:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:55.483 [2024-07-15 07:54:40.172976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:55.483 [2024-07-15 07:54:40.173011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:55.483 [2024-07-15 07:54:40.173022] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1448e00 00:18:55.483 [2024-07-15 07:54:40.173027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:55.483 [2024-07-15 07:54:40.174303] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:55.483 [2024-07-15 07:54:40.174323] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:55.483 [2024-07-15 
07:54:40.174372] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:55.483 [2024-07-15 07:54:40.174391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:55.483 pt1 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.483 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.742 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.742 "name": "raid_bdev1", 00:18:55.742 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:55.742 "strip_size_kb": 64, 00:18:55.742 "state": "configuring", 00:18:55.742 "raid_level": "concat", 00:18:55.742 "superblock": true, 00:18:55.742 "num_base_bdevs": 4, 00:18:55.742 "num_base_bdevs_discovered": 1, 00:18:55.742 "num_base_bdevs_operational": 4, 00:18:55.742 "base_bdevs_list": [ 00:18:55.742 { 00:18:55.742 "name": "pt1", 00:18:55.742 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:55.742 "is_configured": true, 00:18:55.742 "data_offset": 2048, 00:18:55.742 "data_size": 63488 00:18:55.742 }, 00:18:55.742 { 00:18:55.742 "name": null, 00:18:55.742 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:55.742 "is_configured": false, 00:18:55.742 "data_offset": 2048, 00:18:55.742 "data_size": 63488 00:18:55.742 }, 00:18:55.742 { 00:18:55.742 "name": null, 00:18:55.742 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:55.742 "is_configured": false, 00:18:55.742 "data_offset": 2048, 00:18:55.742 "data_size": 63488 00:18:55.742 }, 00:18:55.742 { 00:18:55.742 "name": null, 00:18:55.742 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:55.742 "is_configured": false, 00:18:55.742 "data_offset": 2048, 00:18:55.742 "data_size": 63488 00:18:55.742 } 00:18:55.742 ] 00:18:55.742 }' 00:18:55.742 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.742 07:54:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.312 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:56.312 07:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:56.574 [2024-07-15 07:54:41.095364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:56.574 [2024-07-15 07:54:41.095428] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:56.574 [2024-07-15 07:54:41.095443] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f0c80 00:18:56.574 [2024-07-15 07:54:41.095450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:56.574 [2024-07-15 07:54:41.096035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:56.574 [2024-07-15 07:54:41.096058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:56.574 [2024-07-15 07:54:41.096138] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:56.574 [2024-07-15 07:54:41.096156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:56.574 pt2 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:56.574 [2024-07-15 07:54:41.299901] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.574 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.835 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.835 "name": "raid_bdev1", 00:18:56.835 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:56.835 "strip_size_kb": 64, 00:18:56.835 "state": "configuring", 00:18:56.835 "raid_level": "concat", 00:18:56.835 "superblock": true, 00:18:56.835 "num_base_bdevs": 4, 00:18:56.835 "num_base_bdevs_discovered": 1, 00:18:56.835 "num_base_bdevs_operational": 4, 00:18:56.835 "base_bdevs_list": [ 00:18:56.835 { 00:18:56.835 "name": "pt1", 00:18:56.835 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:56.835 "is_configured": true, 00:18:56.835 "data_offset": 2048, 00:18:56.835 "data_size": 63488 00:18:56.835 }, 00:18:56.835 
{ 00:18:56.835 "name": null, 00:18:56.835 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:56.835 "is_configured": false, 00:18:56.835 "data_offset": 2048, 00:18:56.835 "data_size": 63488 00:18:56.835 }, 00:18:56.835 { 00:18:56.835 "name": null, 00:18:56.836 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:56.836 "is_configured": false, 00:18:56.836 "data_offset": 2048, 00:18:56.836 "data_size": 63488 00:18:56.836 }, 00:18:56.836 { 00:18:56.836 "name": null, 00:18:56.836 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:56.836 "is_configured": false, 00:18:56.836 "data_offset": 2048, 00:18:56.836 "data_size": 63488 00:18:56.836 } 00:18:56.836 ] 00:18:56.836 }' 00:18:56.836 07:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.836 07:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.406 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:57.406 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:57.406 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:57.667 [2024-07-15 07:54:42.254298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:57.667 [2024-07-15 07:54:42.254352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:57.667 [2024-07-15 07:54:42.254368] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f7570 00:18:57.667 [2024-07-15 07:54:42.254375] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:57.667 [2024-07-15 07:54:42.254745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:57.667 [2024-07-15 07:54:42.254758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:57.667 [2024-07-15 07:54:42.254822] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:57.667 [2024-07-15 07:54:42.254838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:57.667 pt2 00:18:57.667 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:57.667 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:57.667 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:57.928 [2024-07-15 07:54:42.470831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:57.928 [2024-07-15 07:54:42.470854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:57.928 [2024-07-15 07:54:42.470864] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f7b00 00:18:57.928 [2024-07-15 07:54:42.470870] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:57.928 [2024-07-15 07:54:42.471122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:57.928 [2024-07-15 07:54:42.471133] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:57.928 [2024-07-15 07:54:42.471172] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:57.928 [2024-07-15 07:54:42.471183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:57.928 pt3 00:18:57.928 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:57.928 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:57.928 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:58.189 [2024-07-15 07:54:42.687375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:58.189 [2024-07-15 07:54:42.687397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.189 [2024-07-15 07:54:42.687407] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1446450 00:18:58.189 [2024-07-15 07:54:42.687413] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.189 [2024-07-15 07:54:42.687638] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.189 [2024-07-15 07:54:42.687648] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:58.189 [2024-07-15 07:54:42.687683] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:58.189 [2024-07-15 07:54:42.687694] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:58.189 [2024-07-15 07:54:42.687803] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15f2840 00:18:58.189 [2024-07-15 07:54:42.687810] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:58.189 [2024-07-15 07:54:42.687960] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15f6e20 00:18:58.189 [2024-07-15 07:54:42.688078] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15f2840 00:18:58.189 [2024-07-15 07:54:42.688083] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15f2840 00:18:58.189 [2024-07-15 07:54:42.688160] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:58.189 pt4 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:58.189 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:58.190 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:58.190 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.190 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.190 07:54:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.190 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.190 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.190 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:58.450 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:58.450 "name": "raid_bdev1", 00:18:58.450 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:58.450 "strip_size_kb": 64, 00:18:58.450 "state": "online", 00:18:58.450 "raid_level": "concat", 00:18:58.450 "superblock": true, 00:18:58.450 "num_base_bdevs": 4, 00:18:58.450 "num_base_bdevs_discovered": 4, 00:18:58.450 "num_base_bdevs_operational": 4, 00:18:58.450 "base_bdevs_list": [ 00:18:58.450 { 00:18:58.450 "name": "pt1", 00:18:58.450 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:58.450 "is_configured": true, 00:18:58.450 "data_offset": 2048, 00:18:58.450 "data_size": 63488 00:18:58.450 }, 00:18:58.450 { 00:18:58.450 "name": "pt2", 00:18:58.450 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:58.450 "is_configured": true, 00:18:58.450 "data_offset": 2048, 00:18:58.450 "data_size": 63488 00:18:58.450 }, 00:18:58.450 { 00:18:58.450 "name": "pt3", 00:18:58.450 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:58.450 "is_configured": true, 00:18:58.450 "data_offset": 2048, 00:18:58.450 "data_size": 63488 00:18:58.450 }, 00:18:58.450 { 00:18:58.450 "name": "pt4", 00:18:58.450 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:58.450 "is_configured": true, 00:18:58.450 "data_offset": 2048, 00:18:58.450 "data_size": 63488 00:18:58.450 } 00:18:58.450 ] 00:18:58.450 }' 00:18:58.450 07:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:58.450 07:54:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:59.019 [2024-07-15 07:54:43.678154] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:59.019 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:59.019 "name": "raid_bdev1", 00:18:59.019 "aliases": [ 00:18:59.019 "d99ce06e-5f81-41e5-9412-94d6bdd8eca1" 00:18:59.019 ], 00:18:59.019 "product_name": "Raid Volume", 00:18:59.019 "block_size": 512, 00:18:59.019 "num_blocks": 253952, 00:18:59.019 "uuid": 
"d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:59.019 "assigned_rate_limits": { 00:18:59.019 "rw_ios_per_sec": 0, 00:18:59.019 "rw_mbytes_per_sec": 0, 00:18:59.019 "r_mbytes_per_sec": 0, 00:18:59.019 "w_mbytes_per_sec": 0 00:18:59.019 }, 00:18:59.020 "claimed": false, 00:18:59.020 "zoned": false, 00:18:59.020 "supported_io_types": { 00:18:59.020 "read": true, 00:18:59.020 "write": true, 00:18:59.020 "unmap": true, 00:18:59.020 "flush": true, 00:18:59.020 "reset": true, 00:18:59.020 "nvme_admin": false, 00:18:59.020 "nvme_io": false, 00:18:59.020 "nvme_io_md": false, 00:18:59.020 "write_zeroes": true, 00:18:59.020 "zcopy": false, 00:18:59.020 "get_zone_info": false, 00:18:59.020 "zone_management": false, 00:18:59.020 "zone_append": false, 00:18:59.020 "compare": false, 00:18:59.020 "compare_and_write": false, 00:18:59.020 "abort": false, 00:18:59.020 "seek_hole": false, 00:18:59.020 "seek_data": false, 00:18:59.020 "copy": false, 00:18:59.020 "nvme_iov_md": false 00:18:59.020 }, 00:18:59.020 "memory_domains": [ 00:18:59.020 { 00:18:59.020 "dma_device_id": "system", 00:18:59.020 "dma_device_type": 1 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.020 "dma_device_type": 2 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "system", 00:18:59.020 "dma_device_type": 1 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.020 "dma_device_type": 2 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "system", 00:18:59.020 "dma_device_type": 1 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.020 "dma_device_type": 2 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "system", 00:18:59.020 "dma_device_type": 1 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.020 "dma_device_type": 2 00:18:59.020 } 00:18:59.020 ], 00:18:59.020 "driver_specific": { 00:18:59.020 "raid": { 00:18:59.020 "uuid": "d99ce06e-5f81-41e5-9412-94d6bdd8eca1", 00:18:59.020 "strip_size_kb": 64, 00:18:59.020 "state": "online", 00:18:59.020 "raid_level": "concat", 00:18:59.020 "superblock": true, 00:18:59.020 "num_base_bdevs": 4, 00:18:59.020 "num_base_bdevs_discovered": 4, 00:18:59.020 "num_base_bdevs_operational": 4, 00:18:59.020 "base_bdevs_list": [ 00:18:59.020 { 00:18:59.020 "name": "pt1", 00:18:59.020 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:59.020 "is_configured": true, 00:18:59.020 "data_offset": 2048, 00:18:59.020 "data_size": 63488 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "name": "pt2", 00:18:59.020 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:59.020 "is_configured": true, 00:18:59.020 "data_offset": 2048, 00:18:59.020 "data_size": 63488 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "name": "pt3", 00:18:59.020 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:59.020 "is_configured": true, 00:18:59.020 "data_offset": 2048, 00:18:59.020 "data_size": 63488 00:18:59.020 }, 00:18:59.020 { 00:18:59.020 "name": "pt4", 00:18:59.020 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:59.020 "is_configured": true, 00:18:59.020 "data_offset": 2048, 00:18:59.020 "data_size": 63488 00:18:59.020 } 00:18:59.020 ] 00:18:59.020 } 00:18:59.020 } 00:18:59.020 }' 00:18:59.020 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:59.020 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:18:59.020 pt2 00:18:59.020 pt3 00:18:59.020 pt4' 00:18:59.020 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:59.020 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:59.020 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:59.280 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:59.280 "name": "pt1", 00:18:59.280 "aliases": [ 00:18:59.280 "00000000-0000-0000-0000-000000000001" 00:18:59.280 ], 00:18:59.280 "product_name": "passthru", 00:18:59.280 "block_size": 512, 00:18:59.280 "num_blocks": 65536, 00:18:59.280 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:59.280 "assigned_rate_limits": { 00:18:59.280 "rw_ios_per_sec": 0, 00:18:59.280 "rw_mbytes_per_sec": 0, 00:18:59.280 "r_mbytes_per_sec": 0, 00:18:59.280 "w_mbytes_per_sec": 0 00:18:59.280 }, 00:18:59.280 "claimed": true, 00:18:59.280 "claim_type": "exclusive_write", 00:18:59.280 "zoned": false, 00:18:59.280 "supported_io_types": { 00:18:59.280 "read": true, 00:18:59.280 "write": true, 00:18:59.280 "unmap": true, 00:18:59.280 "flush": true, 00:18:59.280 "reset": true, 00:18:59.280 "nvme_admin": false, 00:18:59.280 "nvme_io": false, 00:18:59.280 "nvme_io_md": false, 00:18:59.280 "write_zeroes": true, 00:18:59.280 "zcopy": true, 00:18:59.280 "get_zone_info": false, 00:18:59.280 "zone_management": false, 00:18:59.280 "zone_append": false, 00:18:59.280 "compare": false, 00:18:59.280 "compare_and_write": false, 00:18:59.280 "abort": true, 00:18:59.280 "seek_hole": false, 00:18:59.280 "seek_data": false, 00:18:59.280 "copy": true, 00:18:59.280 "nvme_iov_md": false 00:18:59.280 }, 00:18:59.280 "memory_domains": [ 00:18:59.280 { 00:18:59.280 "dma_device_id": "system", 00:18:59.280 "dma_device_type": 1 00:18:59.280 }, 00:18:59.280 { 00:18:59.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.280 "dma_device_type": 2 00:18:59.280 } 00:18:59.280 ], 00:18:59.280 "driver_specific": { 00:18:59.280 "passthru": { 00:18:59.280 "name": "pt1", 00:18:59.280 "base_bdev_name": "malloc1" 00:18:59.280 } 00:18:59.280 } 00:18:59.280 }' 00:18:59.280 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.280 07:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.539 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.799 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:59.799 07:54:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:59.799 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:59.799 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:59.799 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:59.799 "name": "pt2", 00:18:59.799 "aliases": [ 00:18:59.799 "00000000-0000-0000-0000-000000000002" 00:18:59.799 ], 00:18:59.799 "product_name": "passthru", 00:18:59.799 "block_size": 512, 00:18:59.799 "num_blocks": 65536, 00:18:59.799 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:59.799 "assigned_rate_limits": { 00:18:59.799 "rw_ios_per_sec": 0, 00:18:59.799 "rw_mbytes_per_sec": 0, 00:18:59.799 "r_mbytes_per_sec": 0, 00:18:59.799 "w_mbytes_per_sec": 0 00:18:59.799 }, 00:18:59.799 "claimed": true, 00:18:59.799 "claim_type": "exclusive_write", 00:18:59.799 "zoned": false, 00:18:59.799 "supported_io_types": { 00:18:59.799 "read": true, 00:18:59.799 "write": true, 00:18:59.799 "unmap": true, 00:18:59.799 "flush": true, 00:18:59.799 "reset": true, 00:18:59.799 "nvme_admin": false, 00:18:59.799 "nvme_io": false, 00:18:59.799 "nvme_io_md": false, 00:18:59.799 "write_zeroes": true, 00:18:59.799 "zcopy": true, 00:18:59.799 "get_zone_info": false, 00:18:59.799 "zone_management": false, 00:18:59.799 "zone_append": false, 00:18:59.799 "compare": false, 00:18:59.799 "compare_and_write": false, 00:18:59.799 "abort": true, 00:18:59.799 "seek_hole": false, 00:18:59.799 "seek_data": false, 00:18:59.799 "copy": true, 00:18:59.799 "nvme_iov_md": false 00:18:59.799 }, 00:18:59.799 "memory_domains": [ 00:18:59.799 { 00:18:59.799 "dma_device_id": "system", 00:18:59.799 "dma_device_type": 1 00:18:59.799 }, 00:18:59.799 { 00:18:59.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.799 "dma_device_type": 2 00:18:59.799 } 00:18:59.799 ], 00:18:59.799 "driver_specific": { 00:18:59.799 "passthru": { 00:18:59.799 "name": "pt2", 00:18:59.799 "base_bdev_name": "malloc2" 00:18:59.799 } 00:18:59.799 } 00:18:59.799 }' 00:18:59.799 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.059 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.319 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:00.319 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:00.319 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:00.319 07:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:00.319 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:00.319 "name": "pt3", 00:19:00.319 "aliases": [ 00:19:00.319 "00000000-0000-0000-0000-000000000003" 00:19:00.319 ], 00:19:00.319 "product_name": "passthru", 00:19:00.319 "block_size": 512, 00:19:00.319 "num_blocks": 65536, 00:19:00.319 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:00.319 "assigned_rate_limits": { 00:19:00.319 "rw_ios_per_sec": 0, 00:19:00.319 "rw_mbytes_per_sec": 0, 00:19:00.319 "r_mbytes_per_sec": 0, 00:19:00.319 "w_mbytes_per_sec": 0 00:19:00.319 }, 00:19:00.319 "claimed": true, 00:19:00.319 "claim_type": "exclusive_write", 00:19:00.319 "zoned": false, 00:19:00.319 "supported_io_types": { 00:19:00.319 "read": true, 00:19:00.319 "write": true, 00:19:00.319 "unmap": true, 00:19:00.320 "flush": true, 00:19:00.320 "reset": true, 00:19:00.320 "nvme_admin": false, 00:19:00.320 "nvme_io": false, 00:19:00.320 "nvme_io_md": false, 00:19:00.320 "write_zeroes": true, 00:19:00.320 "zcopy": true, 00:19:00.320 "get_zone_info": false, 00:19:00.320 "zone_management": false, 00:19:00.320 "zone_append": false, 00:19:00.320 "compare": false, 00:19:00.320 "compare_and_write": false, 00:19:00.320 "abort": true, 00:19:00.320 "seek_hole": false, 00:19:00.320 "seek_data": false, 00:19:00.320 "copy": true, 00:19:00.320 "nvme_iov_md": false 00:19:00.320 }, 00:19:00.320 "memory_domains": [ 00:19:00.320 { 00:19:00.320 "dma_device_id": "system", 00:19:00.320 "dma_device_type": 1 00:19:00.320 }, 00:19:00.320 { 00:19:00.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.320 "dma_device_type": 2 00:19:00.320 } 00:19:00.320 ], 00:19:00.320 "driver_specific": { 00:19:00.320 "passthru": { 00:19:00.320 "name": "pt3", 00:19:00.320 "base_bdev_name": "malloc3" 00:19:00.320 } 00:19:00.320 } 00:19:00.320 }' 00:19:00.320 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.320 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.581 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.841 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:00.841 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:00.841 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:00.841 
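The loop above walks the base bdev names that bdev_raid.sh@200-201 extracted from the raid bdev's JSON with a jq select on is_configured. A self-contained sketch of that extraction step, reusing the socket path from this run:

  #!/usr/bin/env bash
  # Pull the configured base bdev names out of a raid bdev, as at @200-201 above.
  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  raid_bdev_info=$($RPC bdev_get_bdevs -b raid_bdev1 | jq '.[]')
  base_bdev_names=$(jq -r \
    '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' \
    <<< "$raid_bdev_info")

  # for the online concat volume in this run this prints: pt1 pt2 pt3 pt4
  echo $base_bdev_names
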
07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:00.841 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:00.841 "name": "pt4", 00:19:00.841 "aliases": [ 00:19:00.841 "00000000-0000-0000-0000-000000000004" 00:19:00.841 ], 00:19:00.841 "product_name": "passthru", 00:19:00.841 "block_size": 512, 00:19:00.841 "num_blocks": 65536, 00:19:00.841 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:00.841 "assigned_rate_limits": { 00:19:00.841 "rw_ios_per_sec": 0, 00:19:00.841 "rw_mbytes_per_sec": 0, 00:19:00.841 "r_mbytes_per_sec": 0, 00:19:00.841 "w_mbytes_per_sec": 0 00:19:00.841 }, 00:19:00.841 "claimed": true, 00:19:00.841 "claim_type": "exclusive_write", 00:19:00.841 "zoned": false, 00:19:00.841 "supported_io_types": { 00:19:00.841 "read": true, 00:19:00.841 "write": true, 00:19:00.841 "unmap": true, 00:19:00.841 "flush": true, 00:19:00.841 "reset": true, 00:19:00.841 "nvme_admin": false, 00:19:00.841 "nvme_io": false, 00:19:00.841 "nvme_io_md": false, 00:19:00.841 "write_zeroes": true, 00:19:00.841 "zcopy": true, 00:19:00.841 "get_zone_info": false, 00:19:00.841 "zone_management": false, 00:19:00.841 "zone_append": false, 00:19:00.841 "compare": false, 00:19:00.841 "compare_and_write": false, 00:19:00.841 "abort": true, 00:19:00.841 "seek_hole": false, 00:19:00.841 "seek_data": false, 00:19:00.841 "copy": true, 00:19:00.841 "nvme_iov_md": false 00:19:00.841 }, 00:19:00.841 "memory_domains": [ 00:19:00.841 { 00:19:00.841 "dma_device_id": "system", 00:19:00.841 "dma_device_type": 1 00:19:00.841 }, 00:19:00.841 { 00:19:00.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.841 "dma_device_type": 2 00:19:00.841 } 00:19:00.841 ], 00:19:00.841 "driver_specific": { 00:19:00.841 "passthru": { 00:19:00.841 "name": "pt4", 00:19:00.841 "base_bdev_name": "malloc4" 00:19:00.841 } 00:19:00.841 } 00:19:00.841 }' 00:19:00.841 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:01.103 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.363 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.363 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:01.363 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:01.363 07:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:01.363 [2024-07-15 07:54:46.096261] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:01.363 07:54:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d99ce06e-5f81-41e5-9412-94d6bdd8eca1 '!=' d99ce06e-5f81-41e5-9412-94d6bdd8eca1 ']' 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1681807 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1681807 ']' 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1681807 00:19:01.363 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1681807 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1681807' 00:19:01.624 killing process with pid 1681807 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1681807 00:19:01.624 [2024-07-15 07:54:46.166813] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:01.624 [2024-07-15 07:54:46.166863] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:01.624 [2024-07-15 07:54:46.166914] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:01.624 [2024-07-15 07:54:46.166921] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15f2840 name raid_bdev1, state offline 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1681807 00:19:01.624 [2024-07-15 07:54:46.186819] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:01.624 00:19:01.624 real 0m14.276s 00:19:01.624 user 0m26.302s 00:19:01.624 sys 0m2.135s 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:01.624 07:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.624 ************************************ 00:19:01.624 END TEST raid_superblock_test 00:19:01.624 ************************************ 00:19:01.624 07:54:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:01.624 07:54:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:19:01.624 07:54:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:01.624 07:54:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:01.624 07:54:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:01.886 ************************************ 00:19:01.886 START TEST raid_read_error_test 00:19:01.886 ************************************ 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 
-- # raid_io_error_test concat 4 read 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.V61AWxbEMn 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1684395 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1684395 /var/tmp/spdk-raid.sock 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:01.886 
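The @805-809 lines above launch bdevperf against the shared RPC socket so that I/O errors can later be injected into the raid volume under test. A condensed sketch of that setup; the bdevperf command line is taken verbatim from the trace, while the readiness poll below stands in for the waitforlisten helper and is an assumption, not the autotest implementation:

  #!/usr/bin/env bash
  # Condensed bdevperf setup for the read-error test traced above.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock
  bdevperf_log=$(mktemp -p /raidtest)

  # flags copied from the trace: 60s randrw at 50% reads, 128k I/O, queue depth 1,
  # started idle (-z) so the bdevs can be configured over RPC before I/O begins
  "$SPDK/build/examples/bdevperf" -r "$SOCK" -T raid_bdev1 -t 60 -w randrw -M 50 \
    -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
  raid_pid=$!

  # stand-in for waitforlisten: poll until the RPC server answers
  until "$SPDK/scripts/rpc.py" -s "$SOCK" rpc_get_methods > /dev/null 2>&1; do
    sleep 0.5
  done
  echo "bdevperf (pid $raid_pid) is listening on $SOCK"
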
07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1684395 ']' 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:01.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:01.886 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.886 [2024-07-15 07:54:46.506357] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:19:01.886 [2024-07-15 07:54:46.506484] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1684395 ] 00:19:02.148 [2024-07-15 07:54:46.650561] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:02.148 [2024-07-15 07:54:46.744577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:02.148 [2024-07-15 07:54:46.803527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:02.148 [2024-07-15 07:54:46.803561] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:02.408 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:02.408 07:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:02.408 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:02.408 07:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:02.668 BaseBdev1_malloc 00:19:02.668 07:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:02.668 true 00:19:02.668 07:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:02.929 [2024-07-15 07:54:47.549068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:02.929 [2024-07-15 07:54:47.549113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:02.929 [2024-07-15 07:54:47.549126] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad6b50 00:19:02.929 [2024-07-15 07:54:47.549134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:02.929 [2024-07-15 07:54:47.550676] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:02.929 [2024-07-15 07:54:47.550721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:02.929 BaseBdev1 00:19:02.929 07:54:47 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:02.929 07:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:03.190 BaseBdev2_malloc 00:19:03.190 07:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:03.450 true 00:19:03.450 07:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:03.450 [2024-07-15 07:54:48.167947] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:03.450 [2024-07-15 07:54:48.167990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:03.450 [2024-07-15 07:54:48.168003] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1abaea0 00:19:03.450 [2024-07-15 07:54:48.168019] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:03.450 [2024-07-15 07:54:48.169402] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:03.450 [2024-07-15 07:54:48.169436] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:03.450 BaseBdev2 00:19:03.450 07:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:03.450 07:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:03.711 BaseBdev3_malloc 00:19:03.711 07:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:03.971 true 00:19:03.971 07:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:04.232 [2024-07-15 07:54:48.782833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:04.232 [2024-07-15 07:54:48.782878] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.232 [2024-07-15 07:54:48.782894] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1abefb0 00:19:04.232 [2024-07-15 07:54:48.782901] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.232 [2024-07-15 07:54:48.784303] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.232 [2024-07-15 07:54:48.784336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:04.232 BaseBdev3 00:19:04.232 07:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:04.232 07:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:04.498 BaseBdev4_malloc 00:19:04.498 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:04.498 true 00:19:04.498 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:04.769 [2024-07-15 07:54:49.402455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:04.769 [2024-07-15 07:54:49.402497] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.769 [2024-07-15 07:54:49.402511] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac0980 00:19:04.769 [2024-07-15 07:54:49.402519] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.769 [2024-07-15 07:54:49.403908] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.769 [2024-07-15 07:54:49.403941] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:04.769 BaseBdev4 00:19:04.769 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:05.035 [2024-07-15 07:54:49.607011] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:05.035 [2024-07-15 07:54:49.608197] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:05.035 [2024-07-15 07:54:49.608262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:05.035 [2024-07-15 07:54:49.608309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:05.035 [2024-07-15 07:54:49.608507] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ac04e0 00:19:05.035 [2024-07-15 07:54:49.608515] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:05.035 [2024-07-15 07:54:49.608692] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1922210 00:19:05.035 [2024-07-15 07:54:49.608840] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ac04e0 00:19:05.035 [2024-07-15 07:54:49.608847] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ac04e0 00:19:05.035 [2024-07-15 07:54:49.608930] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.035 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.296 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.296 "name": "raid_bdev1", 00:19:05.296 "uuid": "0e5b707f-385c-4a20-a42a-3f66034c6840", 00:19:05.296 "strip_size_kb": 64, 00:19:05.296 "state": "online", 00:19:05.296 "raid_level": "concat", 00:19:05.296 "superblock": true, 00:19:05.296 "num_base_bdevs": 4, 00:19:05.296 "num_base_bdevs_discovered": 4, 00:19:05.296 "num_base_bdevs_operational": 4, 00:19:05.296 "base_bdevs_list": [ 00:19:05.296 { 00:19:05.296 "name": "BaseBdev1", 00:19:05.296 "uuid": "254dc616-07a0-501c-b1d2-78c35b5b7e79", 00:19:05.296 "is_configured": true, 00:19:05.296 "data_offset": 2048, 00:19:05.296 "data_size": 63488 00:19:05.296 }, 00:19:05.296 { 00:19:05.296 "name": "BaseBdev2", 00:19:05.296 "uuid": "6d6a1b6b-56ca-507d-852c-dbc8aa168b2c", 00:19:05.296 "is_configured": true, 00:19:05.296 "data_offset": 2048, 00:19:05.296 "data_size": 63488 00:19:05.296 }, 00:19:05.296 { 00:19:05.296 "name": "BaseBdev3", 00:19:05.296 "uuid": "3bff8c0d-e277-593c-affe-db75abc6887b", 00:19:05.296 "is_configured": true, 00:19:05.296 "data_offset": 2048, 00:19:05.296 "data_size": 63488 00:19:05.296 }, 00:19:05.296 { 00:19:05.296 "name": "BaseBdev4", 00:19:05.296 "uuid": "2e35e911-396b-5a5d-8808-02a17a6ba4f6", 00:19:05.296 "is_configured": true, 00:19:05.296 "data_offset": 2048, 00:19:05.296 "data_size": 63488 00:19:05.296 } 00:19:05.296 ] 00:19:05.296 }' 00:19:05.296 07:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.296 07:54:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.867 07:54:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:05.867 07:54:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:05.867 [2024-07-15 07:54:50.481502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac6170 00:19:06.808 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:07.069 07:54:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.069 "name": "raid_bdev1", 00:19:07.069 "uuid": "0e5b707f-385c-4a20-a42a-3f66034c6840", 00:19:07.069 "strip_size_kb": 64, 00:19:07.069 "state": "online", 00:19:07.069 "raid_level": "concat", 00:19:07.069 "superblock": true, 00:19:07.069 "num_base_bdevs": 4, 00:19:07.069 "num_base_bdevs_discovered": 4, 00:19:07.069 "num_base_bdevs_operational": 4, 00:19:07.069 "base_bdevs_list": [ 00:19:07.069 { 00:19:07.069 "name": "BaseBdev1", 00:19:07.069 "uuid": "254dc616-07a0-501c-b1d2-78c35b5b7e79", 00:19:07.069 "is_configured": true, 00:19:07.069 "data_offset": 2048, 00:19:07.069 "data_size": 63488 00:19:07.069 }, 00:19:07.069 { 00:19:07.069 "name": "BaseBdev2", 00:19:07.069 "uuid": "6d6a1b6b-56ca-507d-852c-dbc8aa168b2c", 00:19:07.069 "is_configured": true, 00:19:07.069 "data_offset": 2048, 00:19:07.069 "data_size": 63488 00:19:07.069 }, 00:19:07.069 { 00:19:07.069 "name": "BaseBdev3", 00:19:07.069 "uuid": "3bff8c0d-e277-593c-affe-db75abc6887b", 00:19:07.069 "is_configured": true, 00:19:07.069 "data_offset": 2048, 00:19:07.069 "data_size": 63488 00:19:07.069 }, 00:19:07.069 { 00:19:07.069 "name": "BaseBdev4", 00:19:07.069 "uuid": "2e35e911-396b-5a5d-8808-02a17a6ba4f6", 00:19:07.069 "is_configured": true, 00:19:07.069 "data_offset": 2048, 00:19:07.069 "data_size": 63488 00:19:07.069 } 00:19:07.069 ] 00:19:07.069 }' 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.069 07:54:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.639 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:07.898 [2024-07-15 07:54:52.486441] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:07.898 [2024-07-15 07:54:52.486469] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:07.898 [2024-07-15 07:54:52.489051] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:07.898 [2024-07-15 07:54:52.489080] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:07.898 [2024-07-15 07:54:52.489109] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:07.898 [2024-07-15 07:54:52.489115] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ac04e0 name raid_bdev1, state offline 00:19:07.898 0 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1684395 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1684395 ']' 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1684395 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1684395 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1684395' 00:19:07.898 killing process with pid 1684395 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1684395 00:19:07.898 [2024-07-15 07:54:52.572100] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:07.898 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1684395 00:19:07.898 [2024-07-15 07:54:52.589178] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.V61AWxbEMn 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:19:08.158 00:19:08.158 real 0m6.334s 00:19:08.158 user 0m10.456s 00:19:08.158 sys 0m1.078s 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:08.158 07:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.158 ************************************ 00:19:08.158 END TEST raid_read_error_test 00:19:08.158 ************************************ 00:19:08.158 07:54:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:08.158 07:54:52 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:19:08.158 07:54:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:08.158 07:54:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:08.158 07:54:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:08.158 ************************************ 00:19:08.158 START TEST raid_write_error_test 00:19:08.158 ************************************ 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 write 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:08.158 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zCu2FxKlWL 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1685695 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1685695 /var/tmp/spdk-raid.sock 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f 
-L bdev_raid 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1685695 ']' 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:08.159 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:08.159 07:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.159 [2024-07-15 07:54:52.868422] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:19:08.159 [2024-07-15 07:54:52.868467] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1685695 ] 00:19:08.418 [2024-07-15 07:54:52.954961] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:08.418 [2024-07-15 07:54:53.018972] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:08.418 [2024-07-15 07:54:53.060045] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:08.418 [2024-07-15 07:54:53.060070] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:08.985 07:54:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:08.986 07:54:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:08.986 07:54:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:08.986 07:54:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:09.245 BaseBdev1_malloc 00:19:09.245 07:54:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:09.504 true 00:19:09.504 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:09.504 [2024-07-15 07:54:54.194535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:09.504 [2024-07-15 07:54:54.194568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.504 [2024-07-15 07:54:54.194579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13a9b50 00:19:09.504 [2024-07-15 07:54:54.194585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.504 [2024-07-15 07:54:54.195872] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.504 [2024-07-15 07:54:54.195890] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:09.504 BaseBdev1 00:19:09.504 07:54:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:09.504 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:09.764 BaseBdev2_malloc 00:19:09.764 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:10.023 true 00:19:10.023 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:10.023 [2024-07-15 07:54:54.745508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:10.023 [2024-07-15 07:54:54.745532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:10.023 [2024-07-15 07:54:54.745541] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138dea0 00:19:10.023 [2024-07-15 07:54:54.745547] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:10.023 [2024-07-15 07:54:54.746684] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:10.023 [2024-07-15 07:54:54.746702] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:10.023 BaseBdev2 00:19:10.023 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:10.023 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:10.283 BaseBdev3_malloc 00:19:10.283 07:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:10.542 true 00:19:10.542 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:10.542 [2024-07-15 07:54:55.208262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:10.542 [2024-07-15 07:54:55.208286] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:10.542 [2024-07-15 07:54:55.208296] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1391fb0 00:19:10.542 [2024-07-15 07:54:55.208302] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:10.542 [2024-07-15 07:54:55.209437] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:10.542 [2024-07-15 07:54:55.209455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:10.542 BaseBdev3 00:19:10.542 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:10.542 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:10.802 BaseBdev4_malloc 00:19:10.802 07:54:55 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:10.802 true 00:19:10.802 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:11.061 [2024-07-15 07:54:55.658925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:11.061 [2024-07-15 07:54:55.658946] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:11.061 [2024-07-15 07:54:55.658955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1393980 00:19:11.061 [2024-07-15 07:54:55.658965] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:11.061 [2024-07-15 07:54:55.660097] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:11.061 [2024-07-15 07:54:55.660114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:11.061 BaseBdev4 00:19:11.061 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:11.061 [2024-07-15 07:54:55.807342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:11.061 [2024-07-15 07:54:55.808312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:11.061 [2024-07-15 07:54:55.808362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:11.061 [2024-07-15 07:54:55.808406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:11.061 [2024-07-15 07:54:55.808580] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13934e0 00:19:11.061 [2024-07-15 07:54:55.808587] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:11.061 [2024-07-15 07:54:55.808728] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f5210 00:19:11.061 [2024-07-15 07:54:55.808843] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13934e0 00:19:11.061 [2024-07-15 07:54:55.808849] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13934e0 00:19:11.061 [2024-07-15 07:54:55.808921] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.322 07:54:55 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.322 "name": "raid_bdev1", 00:19:11.322 "uuid": "d091ae98-b328-41ac-aa88-8c1edc22d055", 00:19:11.322 "strip_size_kb": 64, 00:19:11.322 "state": "online", 00:19:11.322 "raid_level": "concat", 00:19:11.322 "superblock": true, 00:19:11.322 "num_base_bdevs": 4, 00:19:11.322 "num_base_bdevs_discovered": 4, 00:19:11.322 "num_base_bdevs_operational": 4, 00:19:11.322 "base_bdevs_list": [ 00:19:11.322 { 00:19:11.322 "name": "BaseBdev1", 00:19:11.322 "uuid": "f438a353-fbdb-5ab5-8daa-8f8997402365", 00:19:11.322 "is_configured": true, 00:19:11.322 "data_offset": 2048, 00:19:11.322 "data_size": 63488 00:19:11.322 }, 00:19:11.322 { 00:19:11.322 "name": "BaseBdev2", 00:19:11.322 "uuid": "7b7ac12a-98f5-59c0-ae36-62387490e99c", 00:19:11.322 "is_configured": true, 00:19:11.322 "data_offset": 2048, 00:19:11.322 "data_size": 63488 00:19:11.322 }, 00:19:11.322 { 00:19:11.322 "name": "BaseBdev3", 00:19:11.322 "uuid": "62a9edb3-b21c-5d8f-a104-f4faa5845bd0", 00:19:11.322 "is_configured": true, 00:19:11.322 "data_offset": 2048, 00:19:11.322 "data_size": 63488 00:19:11.322 }, 00:19:11.322 { 00:19:11.322 "name": "BaseBdev4", 00:19:11.322 "uuid": "88d703f7-56ae-5107-9e9e-9169d12a9b2e", 00:19:11.322 "is_configured": true, 00:19:11.322 "data_offset": 2048, 00:19:11.322 "data_size": 63488 00:19:11.322 } 00:19:11.322 ] 00:19:11.322 }' 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.322 07:54:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.891 07:54:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:11.891 07:54:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:11.891 [2024-07-15 07:54:56.609589] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1399170 00:19:12.827 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.086 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:13.344 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:13.344 "name": "raid_bdev1", 00:19:13.344 "uuid": "d091ae98-b328-41ac-aa88-8c1edc22d055", 00:19:13.344 "strip_size_kb": 64, 00:19:13.344 "state": "online", 00:19:13.344 "raid_level": "concat", 00:19:13.345 "superblock": true, 00:19:13.345 "num_base_bdevs": 4, 00:19:13.345 "num_base_bdevs_discovered": 4, 00:19:13.345 "num_base_bdevs_operational": 4, 00:19:13.345 "base_bdevs_list": [ 00:19:13.345 { 00:19:13.345 "name": "BaseBdev1", 00:19:13.345 "uuid": "f438a353-fbdb-5ab5-8daa-8f8997402365", 00:19:13.345 "is_configured": true, 00:19:13.345 "data_offset": 2048, 00:19:13.345 "data_size": 63488 00:19:13.345 }, 00:19:13.345 { 00:19:13.345 "name": "BaseBdev2", 00:19:13.345 "uuid": "7b7ac12a-98f5-59c0-ae36-62387490e99c", 00:19:13.345 "is_configured": true, 00:19:13.345 "data_offset": 2048, 00:19:13.345 "data_size": 63488 00:19:13.345 }, 00:19:13.345 { 00:19:13.345 "name": "BaseBdev3", 00:19:13.345 "uuid": "62a9edb3-b21c-5d8f-a104-f4faa5845bd0", 00:19:13.345 "is_configured": true, 00:19:13.345 "data_offset": 2048, 00:19:13.345 "data_size": 63488 00:19:13.345 }, 00:19:13.345 { 00:19:13.345 "name": "BaseBdev4", 00:19:13.345 "uuid": "88d703f7-56ae-5107-9e9e-9169d12a9b2e", 00:19:13.345 "is_configured": true, 00:19:13.345 "data_offset": 2048, 00:19:13.345 "data_size": 63488 00:19:13.345 } 00:19:13.345 ] 00:19:13.345 }' 00:19:13.345 07:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:13.345 07:54:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:13.912 [2024-07-15 07:54:58.629280] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:13.912 [2024-07-15 07:54:58.629314] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:13.912 [2024-07-15 07:54:58.631897] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:13.912 [2024-07-15 07:54:58.631927] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:13.912 [2024-07-15 07:54:58.631955] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:19:13.912 [2024-07-15 07:54:58.631961] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13934e0 name raid_bdev1, state offline 00:19:13.912 0 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1685695 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1685695 ']' 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1685695 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:13.912 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1685695 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1685695' 00:19:14.172 killing process with pid 1685695 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1685695 00:19:14.172 [2024-07-15 07:54:58.699611] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1685695 00:19:14.172 [2024-07-15 07:54:58.716491] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zCu2FxKlWL 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:19:14.172 00:19:14.172 real 0m6.050s 00:19:14.172 user 0m9.619s 00:19:14.172 sys 0m0.890s 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:14.172 07:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.172 ************************************ 00:19:14.172 END TEST raid_write_error_test 00:19:14.172 ************************************ 00:19:14.172 07:54:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:14.172 07:54:58 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:14.172 07:54:58 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:19:14.172 07:54:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:14.172 07:54:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:14.172 07:54:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:14.172 
************************************ 00:19:14.172 START TEST raid_state_function_test 00:19:14.172 ************************************ 00:19:14.172 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:19:14.172 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:19:14.172 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:14.172 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:14.172 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1686720 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1686720' 00:19:14.432 Process raid 
pid: 1686720 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1686720 /var/tmp/spdk-raid.sock 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1686720 ']' 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:14.432 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:14.432 07:54:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.432 [2024-07-15 07:54:58.997172] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:19:14.432 [2024-07-15 07:54:58.997227] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:14.432 [2024-07-15 07:54:59.088273] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:14.432 [2024-07-15 07:54:59.163758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:14.692 [2024-07-15 07:54:59.203814] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:14.692 [2024-07-15 07:54:59.203835] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:14.952 [2024-07-15 07:54:59.666992] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:14.952 [2024-07-15 07:54:59.667019] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:14.952 [2024-07-15 07:54:59.667025] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:14.952 [2024-07-15 07:54:59.667031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:14.952 [2024-07-15 07:54:59.667036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:14.952 [2024-07-15 07:54:59.667041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:14.952 [2024-07-15 07:54:59.667046] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:14.952 [2024-07-15 07:54:59.667051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev4 doesn't exist now 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.952 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.212 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.212 "name": "Existed_Raid", 00:19:15.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.212 "strip_size_kb": 0, 00:19:15.212 "state": "configuring", 00:19:15.212 "raid_level": "raid1", 00:19:15.212 "superblock": false, 00:19:15.212 "num_base_bdevs": 4, 00:19:15.212 "num_base_bdevs_discovered": 0, 00:19:15.212 "num_base_bdevs_operational": 4, 00:19:15.212 "base_bdevs_list": [ 00:19:15.212 { 00:19:15.212 "name": "BaseBdev1", 00:19:15.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.212 "is_configured": false, 00:19:15.212 "data_offset": 0, 00:19:15.212 "data_size": 0 00:19:15.212 }, 00:19:15.212 { 00:19:15.212 "name": "BaseBdev2", 00:19:15.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.212 "is_configured": false, 00:19:15.212 "data_offset": 0, 00:19:15.212 "data_size": 0 00:19:15.212 }, 00:19:15.212 { 00:19:15.212 "name": "BaseBdev3", 00:19:15.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.212 "is_configured": false, 00:19:15.212 "data_offset": 0, 00:19:15.212 "data_size": 0 00:19:15.212 }, 00:19:15.212 { 00:19:15.212 "name": "BaseBdev4", 00:19:15.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.212 "is_configured": false, 00:19:15.212 "data_offset": 0, 00:19:15.212 "data_size": 0 00:19:15.212 } 00:19:15.212 ] 00:19:15.212 }' 00:19:15.212 07:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.212 07:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.782 07:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:16.042 [2024-07-15 07:55:00.601247] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:16.042 [2024-07-15 07:55:00.601265] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x7d46f0 name Existed_Raid, state configuring 00:19:16.042 07:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:16.042 [2024-07-15 07:55:00.761663] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:16.042 [2024-07-15 07:55:00.761679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:16.042 [2024-07-15 07:55:00.761684] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:16.042 [2024-07-15 07:55:00.761689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:16.042 [2024-07-15 07:55:00.761694] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:16.042 [2024-07-15 07:55:00.761699] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:16.042 [2024-07-15 07:55:00.761704] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:16.042 [2024-07-15 07:55:00.761712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:16.042 07:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:16.302 [2024-07-15 07:55:00.916565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:16.302 BaseBdev1 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:16.302 07:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:16.561 07:55:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:16.561 [ 00:19:16.561 { 00:19:16.561 "name": "BaseBdev1", 00:19:16.561 "aliases": [ 00:19:16.561 "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b" 00:19:16.561 ], 00:19:16.561 "product_name": "Malloc disk", 00:19:16.561 "block_size": 512, 00:19:16.561 "num_blocks": 65536, 00:19:16.561 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:16.561 "assigned_rate_limits": { 00:19:16.561 "rw_ios_per_sec": 0, 00:19:16.561 "rw_mbytes_per_sec": 0, 00:19:16.561 "r_mbytes_per_sec": 0, 00:19:16.561 "w_mbytes_per_sec": 0 00:19:16.561 }, 00:19:16.562 "claimed": true, 00:19:16.562 "claim_type": "exclusive_write", 00:19:16.562 "zoned": false, 00:19:16.562 "supported_io_types": { 00:19:16.562 "read": true, 00:19:16.562 
"write": true, 00:19:16.562 "unmap": true, 00:19:16.562 "flush": true, 00:19:16.562 "reset": true, 00:19:16.562 "nvme_admin": false, 00:19:16.562 "nvme_io": false, 00:19:16.562 "nvme_io_md": false, 00:19:16.562 "write_zeroes": true, 00:19:16.562 "zcopy": true, 00:19:16.562 "get_zone_info": false, 00:19:16.562 "zone_management": false, 00:19:16.562 "zone_append": false, 00:19:16.562 "compare": false, 00:19:16.562 "compare_and_write": false, 00:19:16.562 "abort": true, 00:19:16.562 "seek_hole": false, 00:19:16.562 "seek_data": false, 00:19:16.562 "copy": true, 00:19:16.562 "nvme_iov_md": false 00:19:16.562 }, 00:19:16.562 "memory_domains": [ 00:19:16.562 { 00:19:16.562 "dma_device_id": "system", 00:19:16.562 "dma_device_type": 1 00:19:16.562 }, 00:19:16.562 { 00:19:16.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.562 "dma_device_type": 2 00:19:16.562 } 00:19:16.562 ], 00:19:16.562 "driver_specific": {} 00:19:16.562 } 00:19:16.562 ] 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.562 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:16.822 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.822 "name": "Existed_Raid", 00:19:16.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.822 "strip_size_kb": 0, 00:19:16.822 "state": "configuring", 00:19:16.822 "raid_level": "raid1", 00:19:16.822 "superblock": false, 00:19:16.822 "num_base_bdevs": 4, 00:19:16.822 "num_base_bdevs_discovered": 1, 00:19:16.822 "num_base_bdevs_operational": 4, 00:19:16.822 "base_bdevs_list": [ 00:19:16.822 { 00:19:16.822 "name": "BaseBdev1", 00:19:16.822 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:16.822 "is_configured": true, 00:19:16.822 "data_offset": 0, 00:19:16.822 "data_size": 65536 00:19:16.822 }, 00:19:16.822 { 00:19:16.822 "name": "BaseBdev2", 00:19:16.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.822 "is_configured": false, 00:19:16.822 "data_offset": 0, 00:19:16.822 "data_size": 0 00:19:16.822 }, 00:19:16.822 { 00:19:16.822 "name": "BaseBdev3", 
00:19:16.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.822 "is_configured": false, 00:19:16.822 "data_offset": 0, 00:19:16.822 "data_size": 0 00:19:16.822 }, 00:19:16.822 { 00:19:16.822 "name": "BaseBdev4", 00:19:16.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.822 "is_configured": false, 00:19:16.822 "data_offset": 0, 00:19:16.822 "data_size": 0 00:19:16.822 } 00:19:16.822 ] 00:19:16.822 }' 00:19:16.822 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.822 07:55:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.392 07:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:17.392 [2024-07-15 07:55:02.115592] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:17.392 [2024-07-15 07:55:02.115618] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d3f60 name Existed_Raid, state configuring 00:19:17.392 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:17.652 [2024-07-15 07:55:02.304096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:17.652 [2024-07-15 07:55:02.305186] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:17.652 [2024-07-15 07:55:02.305208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:17.652 [2024-07-15 07:55:02.305214] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:17.652 [2024-07-15 07:55:02.305220] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:17.652 [2024-07-15 07:55:02.305229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:17.652 [2024-07-15 07:55:02.305234] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.652 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.912 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.912 "name": "Existed_Raid", 00:19:17.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.912 "strip_size_kb": 0, 00:19:17.912 "state": "configuring", 00:19:17.912 "raid_level": "raid1", 00:19:17.912 "superblock": false, 00:19:17.912 "num_base_bdevs": 4, 00:19:17.912 "num_base_bdevs_discovered": 1, 00:19:17.912 "num_base_bdevs_operational": 4, 00:19:17.912 "base_bdevs_list": [ 00:19:17.912 { 00:19:17.912 "name": "BaseBdev1", 00:19:17.912 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:17.912 "is_configured": true, 00:19:17.912 "data_offset": 0, 00:19:17.912 "data_size": 65536 00:19:17.912 }, 00:19:17.912 { 00:19:17.912 "name": "BaseBdev2", 00:19:17.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.912 "is_configured": false, 00:19:17.912 "data_offset": 0, 00:19:17.912 "data_size": 0 00:19:17.912 }, 00:19:17.912 { 00:19:17.912 "name": "BaseBdev3", 00:19:17.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.912 "is_configured": false, 00:19:17.912 "data_offset": 0, 00:19:17.912 "data_size": 0 00:19:17.912 }, 00:19:17.912 { 00:19:17.912 "name": "BaseBdev4", 00:19:17.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.912 "is_configured": false, 00:19:17.912 "data_offset": 0, 00:19:17.912 "data_size": 0 00:19:17.912 } 00:19:17.912 ] 00:19:17.912 }' 00:19:17.912 07:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.912 07:55:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.482 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:18.742 [2024-07-15 07:55:03.243259] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:18.742 BaseBdev2 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.742 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:19.002 [ 00:19:19.002 { 
00:19:19.002 "name": "BaseBdev2", 00:19:19.002 "aliases": [ 00:19:19.002 "38d19bac-a54a-4364-a940-9e82d5e10433" 00:19:19.002 ], 00:19:19.002 "product_name": "Malloc disk", 00:19:19.002 "block_size": 512, 00:19:19.002 "num_blocks": 65536, 00:19:19.002 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:19.002 "assigned_rate_limits": { 00:19:19.002 "rw_ios_per_sec": 0, 00:19:19.002 "rw_mbytes_per_sec": 0, 00:19:19.002 "r_mbytes_per_sec": 0, 00:19:19.002 "w_mbytes_per_sec": 0 00:19:19.002 }, 00:19:19.002 "claimed": true, 00:19:19.002 "claim_type": "exclusive_write", 00:19:19.002 "zoned": false, 00:19:19.002 "supported_io_types": { 00:19:19.002 "read": true, 00:19:19.002 "write": true, 00:19:19.002 "unmap": true, 00:19:19.002 "flush": true, 00:19:19.002 "reset": true, 00:19:19.002 "nvme_admin": false, 00:19:19.002 "nvme_io": false, 00:19:19.002 "nvme_io_md": false, 00:19:19.002 "write_zeroes": true, 00:19:19.002 "zcopy": true, 00:19:19.002 "get_zone_info": false, 00:19:19.002 "zone_management": false, 00:19:19.002 "zone_append": false, 00:19:19.002 "compare": false, 00:19:19.002 "compare_and_write": false, 00:19:19.002 "abort": true, 00:19:19.002 "seek_hole": false, 00:19:19.002 "seek_data": false, 00:19:19.002 "copy": true, 00:19:19.002 "nvme_iov_md": false 00:19:19.002 }, 00:19:19.002 "memory_domains": [ 00:19:19.002 { 00:19:19.002 "dma_device_id": "system", 00:19:19.002 "dma_device_type": 1 00:19:19.002 }, 00:19:19.002 { 00:19:19.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.002 "dma_device_type": 2 00:19:19.002 } 00:19:19.002 ], 00:19:19.002 "driver_specific": {} 00:19:19.002 } 00:19:19.002 ] 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.002 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.303 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.303 
"name": "Existed_Raid", 00:19:19.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.303 "strip_size_kb": 0, 00:19:19.303 "state": "configuring", 00:19:19.303 "raid_level": "raid1", 00:19:19.303 "superblock": false, 00:19:19.303 "num_base_bdevs": 4, 00:19:19.303 "num_base_bdevs_discovered": 2, 00:19:19.303 "num_base_bdevs_operational": 4, 00:19:19.303 "base_bdevs_list": [ 00:19:19.303 { 00:19:19.303 "name": "BaseBdev1", 00:19:19.303 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:19.303 "is_configured": true, 00:19:19.303 "data_offset": 0, 00:19:19.303 "data_size": 65536 00:19:19.303 }, 00:19:19.303 { 00:19:19.303 "name": "BaseBdev2", 00:19:19.303 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:19.303 "is_configured": true, 00:19:19.303 "data_offset": 0, 00:19:19.303 "data_size": 65536 00:19:19.303 }, 00:19:19.303 { 00:19:19.303 "name": "BaseBdev3", 00:19:19.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.303 "is_configured": false, 00:19:19.303 "data_offset": 0, 00:19:19.303 "data_size": 0 00:19:19.303 }, 00:19:19.303 { 00:19:19.303 "name": "BaseBdev4", 00:19:19.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.303 "is_configured": false, 00:19:19.303 "data_offset": 0, 00:19:19.303 "data_size": 0 00:19:19.303 } 00:19:19.303 ] 00:19:19.303 }' 00:19:19.303 07:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.303 07:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:19.905 [2024-07-15 07:55:04.575441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:19.905 BaseBdev3 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:19.905 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:20.165 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:20.425 [ 00:19:20.425 { 00:19:20.425 "name": "BaseBdev3", 00:19:20.425 "aliases": [ 00:19:20.425 "d520739e-cf0e-482a-a11a-09409311126c" 00:19:20.425 ], 00:19:20.425 "product_name": "Malloc disk", 00:19:20.425 "block_size": 512, 00:19:20.425 "num_blocks": 65536, 00:19:20.425 "uuid": "d520739e-cf0e-482a-a11a-09409311126c", 00:19:20.425 "assigned_rate_limits": { 00:19:20.425 "rw_ios_per_sec": 0, 00:19:20.425 "rw_mbytes_per_sec": 0, 00:19:20.425 "r_mbytes_per_sec": 0, 00:19:20.425 "w_mbytes_per_sec": 0 00:19:20.425 }, 00:19:20.425 "claimed": true, 00:19:20.425 "claim_type": 
"exclusive_write", 00:19:20.425 "zoned": false, 00:19:20.425 "supported_io_types": { 00:19:20.425 "read": true, 00:19:20.425 "write": true, 00:19:20.425 "unmap": true, 00:19:20.425 "flush": true, 00:19:20.425 "reset": true, 00:19:20.425 "nvme_admin": false, 00:19:20.425 "nvme_io": false, 00:19:20.425 "nvme_io_md": false, 00:19:20.425 "write_zeroes": true, 00:19:20.425 "zcopy": true, 00:19:20.425 "get_zone_info": false, 00:19:20.425 "zone_management": false, 00:19:20.425 "zone_append": false, 00:19:20.425 "compare": false, 00:19:20.425 "compare_and_write": false, 00:19:20.425 "abort": true, 00:19:20.425 "seek_hole": false, 00:19:20.425 "seek_data": false, 00:19:20.425 "copy": true, 00:19:20.426 "nvme_iov_md": false 00:19:20.426 }, 00:19:20.426 "memory_domains": [ 00:19:20.426 { 00:19:20.426 "dma_device_id": "system", 00:19:20.426 "dma_device_type": 1 00:19:20.426 }, 00:19:20.426 { 00:19:20.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.426 "dma_device_type": 2 00:19:20.426 } 00:19:20.426 ], 00:19:20.426 "driver_specific": {} 00:19:20.426 } 00:19:20.426 ] 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.426 07:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.426 07:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.426 "name": "Existed_Raid", 00:19:20.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.426 "strip_size_kb": 0, 00:19:20.426 "state": "configuring", 00:19:20.426 "raid_level": "raid1", 00:19:20.426 "superblock": false, 00:19:20.426 "num_base_bdevs": 4, 00:19:20.426 "num_base_bdevs_discovered": 3, 00:19:20.426 "num_base_bdevs_operational": 4, 00:19:20.426 "base_bdevs_list": [ 00:19:20.426 { 00:19:20.426 "name": "BaseBdev1", 00:19:20.426 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:20.426 "is_configured": true, 00:19:20.426 
"data_offset": 0, 00:19:20.426 "data_size": 65536 00:19:20.426 }, 00:19:20.426 { 00:19:20.426 "name": "BaseBdev2", 00:19:20.426 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:20.426 "is_configured": true, 00:19:20.426 "data_offset": 0, 00:19:20.426 "data_size": 65536 00:19:20.426 }, 00:19:20.426 { 00:19:20.426 "name": "BaseBdev3", 00:19:20.426 "uuid": "d520739e-cf0e-482a-a11a-09409311126c", 00:19:20.426 "is_configured": true, 00:19:20.426 "data_offset": 0, 00:19:20.426 "data_size": 65536 00:19:20.426 }, 00:19:20.426 { 00:19:20.426 "name": "BaseBdev4", 00:19:20.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.426 "is_configured": false, 00:19:20.426 "data_offset": 0, 00:19:20.426 "data_size": 0 00:19:20.426 } 00:19:20.426 ] 00:19:20.426 }' 00:19:20.426 07:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.686 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:21.257 [2024-07-15 07:55:05.915619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:21.257 [2024-07-15 07:55:05.915644] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7d4fc0 00:19:21.257 [2024-07-15 07:55:05.915649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:21.257 [2024-07-15 07:55:05.915817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7d4c00 00:19:21.257 [2024-07-15 07:55:05.915916] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7d4fc0 00:19:21.257 [2024-07-15 07:55:05.915922] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7d4fc0 00:19:21.257 [2024-07-15 07:55:05.916039] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:21.257 BaseBdev4 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:21.257 07:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:21.518 07:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:21.777 [ 00:19:21.777 { 00:19:21.777 "name": "BaseBdev4", 00:19:21.777 "aliases": [ 00:19:21.777 "da3205e2-c4e9-4441-8516-a89427143b58" 00:19:21.777 ], 00:19:21.777 "product_name": "Malloc disk", 00:19:21.777 "block_size": 512, 00:19:21.777 "num_blocks": 65536, 00:19:21.777 "uuid": "da3205e2-c4e9-4441-8516-a89427143b58", 00:19:21.777 "assigned_rate_limits": { 
00:19:21.777 "rw_ios_per_sec": 0, 00:19:21.777 "rw_mbytes_per_sec": 0, 00:19:21.777 "r_mbytes_per_sec": 0, 00:19:21.777 "w_mbytes_per_sec": 0 00:19:21.777 }, 00:19:21.777 "claimed": true, 00:19:21.777 "claim_type": "exclusive_write", 00:19:21.777 "zoned": false, 00:19:21.777 "supported_io_types": { 00:19:21.777 "read": true, 00:19:21.777 "write": true, 00:19:21.777 "unmap": true, 00:19:21.777 "flush": true, 00:19:21.777 "reset": true, 00:19:21.777 "nvme_admin": false, 00:19:21.777 "nvme_io": false, 00:19:21.777 "nvme_io_md": false, 00:19:21.777 "write_zeroes": true, 00:19:21.777 "zcopy": true, 00:19:21.777 "get_zone_info": false, 00:19:21.777 "zone_management": false, 00:19:21.777 "zone_append": false, 00:19:21.777 "compare": false, 00:19:21.777 "compare_and_write": false, 00:19:21.777 "abort": true, 00:19:21.777 "seek_hole": false, 00:19:21.777 "seek_data": false, 00:19:21.777 "copy": true, 00:19:21.777 "nvme_iov_md": false 00:19:21.777 }, 00:19:21.777 "memory_domains": [ 00:19:21.777 { 00:19:21.777 "dma_device_id": "system", 00:19:21.777 "dma_device_type": 1 00:19:21.777 }, 00:19:21.777 { 00:19:21.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.777 "dma_device_type": 2 00:19:21.777 } 00:19:21.777 ], 00:19:21.777 "driver_specific": {} 00:19:21.777 } 00:19:21.777 ] 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.777 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.777 "name": "Existed_Raid", 00:19:21.777 "uuid": "a095fa3f-9355-4ceb-b756-e6edc71c1dcb", 00:19:21.777 "strip_size_kb": 0, 00:19:21.777 "state": "online", 00:19:21.777 "raid_level": "raid1", 00:19:21.777 "superblock": false, 00:19:21.777 "num_base_bdevs": 4, 00:19:21.777 "num_base_bdevs_discovered": 4, 00:19:21.777 "num_base_bdevs_operational": 4, 
00:19:21.777 "base_bdevs_list": [ 00:19:21.777 { 00:19:21.777 "name": "BaseBdev1", 00:19:21.777 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:21.777 "is_configured": true, 00:19:21.777 "data_offset": 0, 00:19:21.777 "data_size": 65536 00:19:21.777 }, 00:19:21.777 { 00:19:21.777 "name": "BaseBdev2", 00:19:21.777 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:21.777 "is_configured": true, 00:19:21.777 "data_offset": 0, 00:19:21.777 "data_size": 65536 00:19:21.777 }, 00:19:21.778 { 00:19:21.778 "name": "BaseBdev3", 00:19:21.778 "uuid": "d520739e-cf0e-482a-a11a-09409311126c", 00:19:21.778 "is_configured": true, 00:19:21.778 "data_offset": 0, 00:19:21.778 "data_size": 65536 00:19:21.778 }, 00:19:21.778 { 00:19:21.778 "name": "BaseBdev4", 00:19:21.778 "uuid": "da3205e2-c4e9-4441-8516-a89427143b58", 00:19:21.778 "is_configured": true, 00:19:21.778 "data_offset": 0, 00:19:21.778 "data_size": 65536 00:19:21.778 } 00:19:21.778 ] 00:19:21.778 }' 00:19:21.778 07:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.778 07:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:22.719 [2024-07-15 07:55:07.315442] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:22.719 "name": "Existed_Raid", 00:19:22.719 "aliases": [ 00:19:22.719 "a095fa3f-9355-4ceb-b756-e6edc71c1dcb" 00:19:22.719 ], 00:19:22.719 "product_name": "Raid Volume", 00:19:22.719 "block_size": 512, 00:19:22.719 "num_blocks": 65536, 00:19:22.719 "uuid": "a095fa3f-9355-4ceb-b756-e6edc71c1dcb", 00:19:22.719 "assigned_rate_limits": { 00:19:22.719 "rw_ios_per_sec": 0, 00:19:22.719 "rw_mbytes_per_sec": 0, 00:19:22.719 "r_mbytes_per_sec": 0, 00:19:22.719 "w_mbytes_per_sec": 0 00:19:22.719 }, 00:19:22.719 "claimed": false, 00:19:22.719 "zoned": false, 00:19:22.719 "supported_io_types": { 00:19:22.719 "read": true, 00:19:22.719 "write": true, 00:19:22.719 "unmap": false, 00:19:22.719 "flush": false, 00:19:22.719 "reset": true, 00:19:22.719 "nvme_admin": false, 00:19:22.719 "nvme_io": false, 00:19:22.719 "nvme_io_md": false, 00:19:22.719 "write_zeroes": true, 00:19:22.719 "zcopy": false, 00:19:22.719 "get_zone_info": false, 00:19:22.719 "zone_management": false, 00:19:22.719 "zone_append": false, 00:19:22.719 "compare": false, 00:19:22.719 "compare_and_write": false, 00:19:22.719 "abort": false, 00:19:22.719 "seek_hole": false, 00:19:22.719 "seek_data": false, 
00:19:22.719 "copy": false, 00:19:22.719 "nvme_iov_md": false 00:19:22.719 }, 00:19:22.719 "memory_domains": [ 00:19:22.719 { 00:19:22.719 "dma_device_id": "system", 00:19:22.719 "dma_device_type": 1 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.719 "dma_device_type": 2 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "system", 00:19:22.719 "dma_device_type": 1 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.719 "dma_device_type": 2 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "system", 00:19:22.719 "dma_device_type": 1 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.719 "dma_device_type": 2 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "system", 00:19:22.719 "dma_device_type": 1 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.719 "dma_device_type": 2 00:19:22.719 } 00:19:22.719 ], 00:19:22.719 "driver_specific": { 00:19:22.719 "raid": { 00:19:22.719 "uuid": "a095fa3f-9355-4ceb-b756-e6edc71c1dcb", 00:19:22.719 "strip_size_kb": 0, 00:19:22.719 "state": "online", 00:19:22.719 "raid_level": "raid1", 00:19:22.719 "superblock": false, 00:19:22.719 "num_base_bdevs": 4, 00:19:22.719 "num_base_bdevs_discovered": 4, 00:19:22.719 "num_base_bdevs_operational": 4, 00:19:22.719 "base_bdevs_list": [ 00:19:22.719 { 00:19:22.719 "name": "BaseBdev1", 00:19:22.719 "uuid": "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:22.719 "is_configured": true, 00:19:22.719 "data_offset": 0, 00:19:22.719 "data_size": 65536 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "name": "BaseBdev2", 00:19:22.719 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:22.719 "is_configured": true, 00:19:22.719 "data_offset": 0, 00:19:22.719 "data_size": 65536 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "name": "BaseBdev3", 00:19:22.719 "uuid": "d520739e-cf0e-482a-a11a-09409311126c", 00:19:22.719 "is_configured": true, 00:19:22.719 "data_offset": 0, 00:19:22.719 "data_size": 65536 00:19:22.719 }, 00:19:22.719 { 00:19:22.719 "name": "BaseBdev4", 00:19:22.719 "uuid": "da3205e2-c4e9-4441-8516-a89427143b58", 00:19:22.719 "is_configured": true, 00:19:22.719 "data_offset": 0, 00:19:22.719 "data_size": 65536 00:19:22.719 } 00:19:22.719 ] 00:19:22.719 } 00:19:22.719 } 00:19:22.719 }' 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:22.719 BaseBdev2 00:19:22.719 BaseBdev3 00:19:22.719 BaseBdev4' 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:22.719 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:22.980 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:22.980 "name": "BaseBdev1", 00:19:22.980 "aliases": [ 00:19:22.980 "e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b" 00:19:22.980 ], 00:19:22.980 "product_name": "Malloc disk", 00:19:22.980 "block_size": 512, 00:19:22.980 "num_blocks": 65536, 00:19:22.980 "uuid": 
"e2c19f78-5fdd-442c-aa99-5b1bd76ecb5b", 00:19:22.980 "assigned_rate_limits": { 00:19:22.980 "rw_ios_per_sec": 0, 00:19:22.980 "rw_mbytes_per_sec": 0, 00:19:22.980 "r_mbytes_per_sec": 0, 00:19:22.980 "w_mbytes_per_sec": 0 00:19:22.980 }, 00:19:22.980 "claimed": true, 00:19:22.980 "claim_type": "exclusive_write", 00:19:22.980 "zoned": false, 00:19:22.980 "supported_io_types": { 00:19:22.980 "read": true, 00:19:22.980 "write": true, 00:19:22.980 "unmap": true, 00:19:22.980 "flush": true, 00:19:22.980 "reset": true, 00:19:22.980 "nvme_admin": false, 00:19:22.980 "nvme_io": false, 00:19:22.980 "nvme_io_md": false, 00:19:22.980 "write_zeroes": true, 00:19:22.980 "zcopy": true, 00:19:22.980 "get_zone_info": false, 00:19:22.980 "zone_management": false, 00:19:22.980 "zone_append": false, 00:19:22.980 "compare": false, 00:19:22.980 "compare_and_write": false, 00:19:22.980 "abort": true, 00:19:22.980 "seek_hole": false, 00:19:22.980 "seek_data": false, 00:19:22.980 "copy": true, 00:19:22.980 "nvme_iov_md": false 00:19:22.980 }, 00:19:22.980 "memory_domains": [ 00:19:22.980 { 00:19:22.980 "dma_device_id": "system", 00:19:22.980 "dma_device_type": 1 00:19:22.980 }, 00:19:22.980 { 00:19:22.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.980 "dma_device_type": 2 00:19:22.980 } 00:19:22.980 ], 00:19:22.980 "driver_specific": {} 00:19:22.980 }' 00:19:22.980 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.980 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.980 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:22.980 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.980 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:23.240 07:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.810 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.810 "name": "BaseBdev2", 00:19:23.810 "aliases": [ 00:19:23.810 "38d19bac-a54a-4364-a940-9e82d5e10433" 00:19:23.810 ], 00:19:23.810 "product_name": "Malloc disk", 00:19:23.810 "block_size": 512, 00:19:23.810 "num_blocks": 65536, 00:19:23.810 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:23.810 "assigned_rate_limits": { 00:19:23.810 "rw_ios_per_sec": 0, 00:19:23.810 "rw_mbytes_per_sec": 0, 00:19:23.810 
"r_mbytes_per_sec": 0, 00:19:23.810 "w_mbytes_per_sec": 0 00:19:23.810 }, 00:19:23.810 "claimed": true, 00:19:23.810 "claim_type": "exclusive_write", 00:19:23.810 "zoned": false, 00:19:23.810 "supported_io_types": { 00:19:23.810 "read": true, 00:19:23.810 "write": true, 00:19:23.810 "unmap": true, 00:19:23.810 "flush": true, 00:19:23.810 "reset": true, 00:19:23.810 "nvme_admin": false, 00:19:23.810 "nvme_io": false, 00:19:23.810 "nvme_io_md": false, 00:19:23.810 "write_zeroes": true, 00:19:23.810 "zcopy": true, 00:19:23.810 "get_zone_info": false, 00:19:23.810 "zone_management": false, 00:19:23.810 "zone_append": false, 00:19:23.810 "compare": false, 00:19:23.810 "compare_and_write": false, 00:19:23.810 "abort": true, 00:19:23.810 "seek_hole": false, 00:19:23.810 "seek_data": false, 00:19:23.810 "copy": true, 00:19:23.810 "nvme_iov_md": false 00:19:23.810 }, 00:19:23.810 "memory_domains": [ 00:19:23.810 { 00:19:23.810 "dma_device_id": "system", 00:19:23.810 "dma_device_type": 1 00:19:23.810 }, 00:19:23.810 { 00:19:23.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.810 "dma_device_type": 2 00:19:23.810 } 00:19:23.810 ], 00:19:23.810 "driver_specific": {} 00:19:23.810 }' 00:19:23.810 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.810 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.072 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.338 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.338 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.338 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:24.338 07:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.338 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.338 "name": "BaseBdev3", 00:19:24.338 "aliases": [ 00:19:24.338 "d520739e-cf0e-482a-a11a-09409311126c" 00:19:24.338 ], 00:19:24.338 "product_name": "Malloc disk", 00:19:24.338 "block_size": 512, 00:19:24.338 "num_blocks": 65536, 00:19:24.338 "uuid": "d520739e-cf0e-482a-a11a-09409311126c", 00:19:24.338 "assigned_rate_limits": { 00:19:24.338 "rw_ios_per_sec": 0, 00:19:24.338 "rw_mbytes_per_sec": 0, 00:19:24.338 "r_mbytes_per_sec": 0, 00:19:24.338 "w_mbytes_per_sec": 0 00:19:24.338 }, 00:19:24.338 "claimed": true, 00:19:24.338 "claim_type": "exclusive_write", 00:19:24.338 "zoned": false, 
00:19:24.338 "supported_io_types": { 00:19:24.338 "read": true, 00:19:24.338 "write": true, 00:19:24.338 "unmap": true, 00:19:24.338 "flush": true, 00:19:24.338 "reset": true, 00:19:24.338 "nvme_admin": false, 00:19:24.338 "nvme_io": false, 00:19:24.338 "nvme_io_md": false, 00:19:24.338 "write_zeroes": true, 00:19:24.338 "zcopy": true, 00:19:24.338 "get_zone_info": false, 00:19:24.338 "zone_management": false, 00:19:24.338 "zone_append": false, 00:19:24.338 "compare": false, 00:19:24.338 "compare_and_write": false, 00:19:24.338 "abort": true, 00:19:24.338 "seek_hole": false, 00:19:24.338 "seek_data": false, 00:19:24.338 "copy": true, 00:19:24.338 "nvme_iov_md": false 00:19:24.338 }, 00:19:24.338 "memory_domains": [ 00:19:24.338 { 00:19:24.338 "dma_device_id": "system", 00:19:24.338 "dma_device_type": 1 00:19:24.338 }, 00:19:24.338 { 00:19:24.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.338 "dma_device_type": 2 00:19:24.338 } 00:19:24.338 ], 00:19:24.338 "driver_specific": {} 00:19:24.338 }' 00:19:24.338 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.338 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.600 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.859 "name": "BaseBdev4", 00:19:24.859 "aliases": [ 00:19:24.859 "da3205e2-c4e9-4441-8516-a89427143b58" 00:19:24.859 ], 00:19:24.859 "product_name": "Malloc disk", 00:19:24.859 "block_size": 512, 00:19:24.859 "num_blocks": 65536, 00:19:24.859 "uuid": "da3205e2-c4e9-4441-8516-a89427143b58", 00:19:24.859 "assigned_rate_limits": { 00:19:24.859 "rw_ios_per_sec": 0, 00:19:24.859 "rw_mbytes_per_sec": 0, 00:19:24.859 "r_mbytes_per_sec": 0, 00:19:24.859 "w_mbytes_per_sec": 0 00:19:24.859 }, 00:19:24.859 "claimed": true, 00:19:24.859 "claim_type": "exclusive_write", 00:19:24.859 "zoned": false, 00:19:24.859 "supported_io_types": { 00:19:24.859 "read": true, 00:19:24.859 "write": true, 00:19:24.859 "unmap": true, 00:19:24.859 "flush": true, 00:19:24.859 "reset": true, 
00:19:24.859 "nvme_admin": false, 00:19:24.859 "nvme_io": false, 00:19:24.859 "nvme_io_md": false, 00:19:24.859 "write_zeroes": true, 00:19:24.859 "zcopy": true, 00:19:24.859 "get_zone_info": false, 00:19:24.859 "zone_management": false, 00:19:24.859 "zone_append": false, 00:19:24.859 "compare": false, 00:19:24.859 "compare_and_write": false, 00:19:24.859 "abort": true, 00:19:24.859 "seek_hole": false, 00:19:24.859 "seek_data": false, 00:19:24.859 "copy": true, 00:19:24.859 "nvme_iov_md": false 00:19:24.859 }, 00:19:24.859 "memory_domains": [ 00:19:24.859 { 00:19:24.859 "dma_device_id": "system", 00:19:24.859 "dma_device_type": 1 00:19:24.859 }, 00:19:24.859 { 00:19:24.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.859 "dma_device_type": 2 00:19:24.859 } 00:19:24.859 ], 00:19:24.859 "driver_specific": {} 00:19:24.859 }' 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.859 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.118 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.119 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.378 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.378 07:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:25.378 [2024-07-15 07:55:10.054601] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.378 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.638 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.638 "name": "Existed_Raid", 00:19:25.638 "uuid": "a095fa3f-9355-4ceb-b756-e6edc71c1dcb", 00:19:25.638 "strip_size_kb": 0, 00:19:25.638 "state": "online", 00:19:25.638 "raid_level": "raid1", 00:19:25.638 "superblock": false, 00:19:25.638 "num_base_bdevs": 4, 00:19:25.638 "num_base_bdevs_discovered": 3, 00:19:25.638 "num_base_bdevs_operational": 3, 00:19:25.638 "base_bdevs_list": [ 00:19:25.638 { 00:19:25.638 "name": null, 00:19:25.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.638 "is_configured": false, 00:19:25.638 "data_offset": 0, 00:19:25.638 "data_size": 65536 00:19:25.638 }, 00:19:25.638 { 00:19:25.638 "name": "BaseBdev2", 00:19:25.638 "uuid": "38d19bac-a54a-4364-a940-9e82d5e10433", 00:19:25.638 "is_configured": true, 00:19:25.638 "data_offset": 0, 00:19:25.638 "data_size": 65536 00:19:25.638 }, 00:19:25.638 { 00:19:25.638 "name": "BaseBdev3", 00:19:25.638 "uuid": "d520739e-cf0e-482a-a11a-09409311126c", 00:19:25.638 "is_configured": true, 00:19:25.638 "data_offset": 0, 00:19:25.638 "data_size": 65536 00:19:25.638 }, 00:19:25.638 { 00:19:25.638 "name": "BaseBdev4", 00:19:25.638 "uuid": "da3205e2-c4e9-4441-8516-a89427143b58", 00:19:25.638 "is_configured": true, 00:19:25.638 "data_offset": 0, 00:19:25.638 "data_size": 65536 00:19:25.638 } 00:19:25.638 ] 00:19:25.638 }' 00:19:25.638 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.638 07:55:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.207 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:26.207 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.207 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.207 07:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.467 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.467 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:26.467 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:26.467 [2024-07-15 07:55:11.189431] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:26.467 07:55:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:26.467 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.467 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.467 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.727 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.727 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:26.727 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:26.986 [2024-07-15 07:55:11.572179] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:26.986 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:26.986 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.986 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.986 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:27.245 [2024-07-15 07:55:11.962904] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:27.245 [2024-07-15 07:55:11.962957] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:27.245 [2024-07-15 07:55:11.968933] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:27.245 [2024-07-15 07:55:11.968957] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:27.245 [2024-07-15 07:55:11.968962] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d4fc0 name Existed_Raid, state offline 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.245 07:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:27.504 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:27.504 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:27.504 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:27.504 07:55:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:27.504 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:27.504 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:27.764 BaseBdev2 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:27.764 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:28.026 [ 00:19:28.026 { 00:19:28.026 "name": "BaseBdev2", 00:19:28.026 "aliases": [ 00:19:28.026 "b91c4cf8-7124-485c-9e88-de2f2aaf0c99" 00:19:28.026 ], 00:19:28.026 "product_name": "Malloc disk", 00:19:28.026 "block_size": 512, 00:19:28.026 "num_blocks": 65536, 00:19:28.026 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:28.026 "assigned_rate_limits": { 00:19:28.026 "rw_ios_per_sec": 0, 00:19:28.026 "rw_mbytes_per_sec": 0, 00:19:28.026 "r_mbytes_per_sec": 0, 00:19:28.026 "w_mbytes_per_sec": 0 00:19:28.026 }, 00:19:28.026 "claimed": false, 00:19:28.026 "zoned": false, 00:19:28.026 "supported_io_types": { 00:19:28.026 "read": true, 00:19:28.026 "write": true, 00:19:28.026 "unmap": true, 00:19:28.026 "flush": true, 00:19:28.026 "reset": true, 00:19:28.026 "nvme_admin": false, 00:19:28.026 "nvme_io": false, 00:19:28.026 "nvme_io_md": false, 00:19:28.026 "write_zeroes": true, 00:19:28.026 "zcopy": true, 00:19:28.026 "get_zone_info": false, 00:19:28.026 "zone_management": false, 00:19:28.026 "zone_append": false, 00:19:28.026 "compare": false, 00:19:28.026 "compare_and_write": false, 00:19:28.026 "abort": true, 00:19:28.026 "seek_hole": false, 00:19:28.026 "seek_data": false, 00:19:28.026 "copy": true, 00:19:28.026 "nvme_iov_md": false 00:19:28.026 }, 00:19:28.026 "memory_domains": [ 00:19:28.026 { 00:19:28.026 "dma_device_id": "system", 00:19:28.026 "dma_device_type": 1 00:19:28.026 }, 00:19:28.026 { 00:19:28.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.026 "dma_device_type": 2 00:19:28.026 } 00:19:28.026 ], 00:19:28.026 "driver_specific": {} 00:19:28.026 } 00:19:28.026 ] 00:19:28.026 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:28.026 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:28.026 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.026 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:28.287 BaseBdev3 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.287 07:55:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.547 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:28.547 [ 00:19:28.547 { 00:19:28.547 "name": "BaseBdev3", 00:19:28.547 "aliases": [ 00:19:28.547 "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439" 00:19:28.547 ], 00:19:28.547 "product_name": "Malloc disk", 00:19:28.547 "block_size": 512, 00:19:28.547 "num_blocks": 65536, 00:19:28.547 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:28.547 "assigned_rate_limits": { 00:19:28.547 "rw_ios_per_sec": 0, 00:19:28.547 "rw_mbytes_per_sec": 0, 00:19:28.547 "r_mbytes_per_sec": 0, 00:19:28.547 "w_mbytes_per_sec": 0 00:19:28.547 }, 00:19:28.547 "claimed": false, 00:19:28.547 "zoned": false, 00:19:28.547 "supported_io_types": { 00:19:28.547 "read": true, 00:19:28.547 "write": true, 00:19:28.547 "unmap": true, 00:19:28.547 "flush": true, 00:19:28.547 "reset": true, 00:19:28.547 "nvme_admin": false, 00:19:28.547 "nvme_io": false, 00:19:28.547 "nvme_io_md": false, 00:19:28.547 "write_zeroes": true, 00:19:28.547 "zcopy": true, 00:19:28.547 "get_zone_info": false, 00:19:28.547 "zone_management": false, 00:19:28.547 "zone_append": false, 00:19:28.547 "compare": false, 00:19:28.547 "compare_and_write": false, 00:19:28.547 "abort": true, 00:19:28.547 "seek_hole": false, 00:19:28.547 "seek_data": false, 00:19:28.547 "copy": true, 00:19:28.547 "nvme_iov_md": false 00:19:28.547 }, 00:19:28.547 "memory_domains": [ 00:19:28.547 { 00:19:28.547 "dma_device_id": "system", 00:19:28.547 "dma_device_type": 1 00:19:28.547 }, 00:19:28.547 { 00:19:28.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.547 "dma_device_type": 2 00:19:28.547 } 00:19:28.547 ], 00:19:28.547 "driver_specific": {} 00:19:28.547 } 00:19:28.547 ] 00:19:28.547 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:28.547 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:28.547 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.547 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:28.807 BaseBdev4 00:19:28.807 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:28.807 07:55:13 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:28.807 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.807 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:28.808 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.808 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.808 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.068 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:29.328 [ 00:19:29.328 { 00:19:29.328 "name": "BaseBdev4", 00:19:29.328 "aliases": [ 00:19:29.328 "2628d0c6-e214-4579-8ce5-cecb07c1e27f" 00:19:29.328 ], 00:19:29.328 "product_name": "Malloc disk", 00:19:29.328 "block_size": 512, 00:19:29.328 "num_blocks": 65536, 00:19:29.328 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:29.328 "assigned_rate_limits": { 00:19:29.328 "rw_ios_per_sec": 0, 00:19:29.328 "rw_mbytes_per_sec": 0, 00:19:29.328 "r_mbytes_per_sec": 0, 00:19:29.328 "w_mbytes_per_sec": 0 00:19:29.328 }, 00:19:29.328 "claimed": false, 00:19:29.328 "zoned": false, 00:19:29.328 "supported_io_types": { 00:19:29.328 "read": true, 00:19:29.328 "write": true, 00:19:29.328 "unmap": true, 00:19:29.328 "flush": true, 00:19:29.328 "reset": true, 00:19:29.328 "nvme_admin": false, 00:19:29.328 "nvme_io": false, 00:19:29.328 "nvme_io_md": false, 00:19:29.328 "write_zeroes": true, 00:19:29.328 "zcopy": true, 00:19:29.328 "get_zone_info": false, 00:19:29.328 "zone_management": false, 00:19:29.328 "zone_append": false, 00:19:29.328 "compare": false, 00:19:29.328 "compare_and_write": false, 00:19:29.328 "abort": true, 00:19:29.328 "seek_hole": false, 00:19:29.328 "seek_data": false, 00:19:29.328 "copy": true, 00:19:29.328 "nvme_iov_md": false 00:19:29.328 }, 00:19:29.328 "memory_domains": [ 00:19:29.328 { 00:19:29.328 "dma_device_id": "system", 00:19:29.328 "dma_device_type": 1 00:19:29.328 }, 00:19:29.328 { 00:19:29.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.328 "dma_device_type": 2 00:19:29.328 } 00:19:29.328 ], 00:19:29.328 "driver_specific": {} 00:19:29.328 } 00:19:29.328 ] 00:19:29.328 07:55:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:29.328 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:29.328 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:29.328 07:55:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:29.328 [2024-07-15 07:55:14.010019] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:29.328 [2024-07-15 07:55:14.010048] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:29.328 [2024-07-15 07:55:14.010062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
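The trace above has just created three 32 MB malloc base bdevs (BaseBdev2-4) and asked for a raid1 bdev named Existed_Raid over four members, so the array is left in the "configuring" state while BaseBdev1 is still missing. As a hedged, stand-alone sketch of that same flow (the RPC socket path, sizes, and bdev names are copied from this run and are only assumptions elsewhere; rpc.py is referenced relative to an SPDK checkout):

    # create three of the four base devices: 32 MB each, 512-byte blocks (65536 blocks)
    for b in BaseBdev2 BaseBdev3 BaseBdev4; do
        ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b "$b"
    done
    # request a raid1 bdev over all four names; with BaseBdev1 absent it stays "configuring"
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    # query the array the same way the test's verify_raid_bdev_state helper does
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid") | .state'    # expected: configuring
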
00:19:29.328 [2024-07-15 07:55:14.011099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:29.328 [2024-07-15 07:55:14.011131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.328 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.588 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.588 "name": "Existed_Raid", 00:19:29.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.588 "strip_size_kb": 0, 00:19:29.588 "state": "configuring", 00:19:29.588 "raid_level": "raid1", 00:19:29.588 "superblock": false, 00:19:29.588 "num_base_bdevs": 4, 00:19:29.588 "num_base_bdevs_discovered": 3, 00:19:29.588 "num_base_bdevs_operational": 4, 00:19:29.588 "base_bdevs_list": [ 00:19:29.588 { 00:19:29.588 "name": "BaseBdev1", 00:19:29.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.588 "is_configured": false, 00:19:29.588 "data_offset": 0, 00:19:29.588 "data_size": 0 00:19:29.588 }, 00:19:29.588 { 00:19:29.588 "name": "BaseBdev2", 00:19:29.588 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:29.588 "is_configured": true, 00:19:29.588 "data_offset": 0, 00:19:29.588 "data_size": 65536 00:19:29.588 }, 00:19:29.588 { 00:19:29.588 "name": "BaseBdev3", 00:19:29.588 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:29.588 "is_configured": true, 00:19:29.588 "data_offset": 0, 00:19:29.588 "data_size": 65536 00:19:29.588 }, 00:19:29.588 { 00:19:29.588 "name": "BaseBdev4", 00:19:29.588 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:29.588 "is_configured": true, 00:19:29.588 "data_offset": 0, 00:19:29.588 "data_size": 65536 00:19:29.588 } 00:19:29.588 ] 00:19:29.588 }' 00:19:29.588 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.588 07:55:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.525 07:55:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev2 00:19:30.525 [2024-07-15 07:55:15.116793] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.525 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.094 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.094 "name": "Existed_Raid", 00:19:31.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.094 "strip_size_kb": 0, 00:19:31.094 "state": "configuring", 00:19:31.094 "raid_level": "raid1", 00:19:31.094 "superblock": false, 00:19:31.094 "num_base_bdevs": 4, 00:19:31.094 "num_base_bdevs_discovered": 2, 00:19:31.094 "num_base_bdevs_operational": 4, 00:19:31.094 "base_bdevs_list": [ 00:19:31.094 { 00:19:31.094 "name": "BaseBdev1", 00:19:31.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.094 "is_configured": false, 00:19:31.094 "data_offset": 0, 00:19:31.094 "data_size": 0 00:19:31.094 }, 00:19:31.094 { 00:19:31.094 "name": null, 00:19:31.094 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:31.094 "is_configured": false, 00:19:31.094 "data_offset": 0, 00:19:31.094 "data_size": 65536 00:19:31.094 }, 00:19:31.094 { 00:19:31.094 "name": "BaseBdev3", 00:19:31.094 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:31.094 "is_configured": true, 00:19:31.094 "data_offset": 0, 00:19:31.094 "data_size": 65536 00:19:31.094 }, 00:19:31.094 { 00:19:31.094 "name": "BaseBdev4", 00:19:31.094 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:31.094 "is_configured": true, 00:19:31.094 "data_offset": 0, 00:19:31.094 "data_size": 65536 00:19:31.094 } 00:19:31.094 ] 00:19:31.094 }' 00:19:31.094 07:55:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.094 07:55:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.662 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.662 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:19:31.662 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:31.662 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:31.922 [2024-07-15 07:55:16.557306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:31.922 BaseBdev1 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:31.922 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:32.181 [ 00:19:32.181 { 00:19:32.181 "name": "BaseBdev1", 00:19:32.181 "aliases": [ 00:19:32.181 "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b" 00:19:32.181 ], 00:19:32.181 "product_name": "Malloc disk", 00:19:32.181 "block_size": 512, 00:19:32.181 "num_blocks": 65536, 00:19:32.181 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:32.181 "assigned_rate_limits": { 00:19:32.181 "rw_ios_per_sec": 0, 00:19:32.181 "rw_mbytes_per_sec": 0, 00:19:32.181 "r_mbytes_per_sec": 0, 00:19:32.181 "w_mbytes_per_sec": 0 00:19:32.181 }, 00:19:32.181 "claimed": true, 00:19:32.181 "claim_type": "exclusive_write", 00:19:32.181 "zoned": false, 00:19:32.181 "supported_io_types": { 00:19:32.181 "read": true, 00:19:32.181 "write": true, 00:19:32.181 "unmap": true, 00:19:32.181 "flush": true, 00:19:32.181 "reset": true, 00:19:32.181 "nvme_admin": false, 00:19:32.181 "nvme_io": false, 00:19:32.181 "nvme_io_md": false, 00:19:32.181 "write_zeroes": true, 00:19:32.181 "zcopy": true, 00:19:32.181 "get_zone_info": false, 00:19:32.181 "zone_management": false, 00:19:32.181 "zone_append": false, 00:19:32.181 "compare": false, 00:19:32.181 "compare_and_write": false, 00:19:32.181 "abort": true, 00:19:32.181 "seek_hole": false, 00:19:32.181 "seek_data": false, 00:19:32.181 "copy": true, 00:19:32.181 "nvme_iov_md": false 00:19:32.181 }, 00:19:32.181 "memory_domains": [ 00:19:32.181 { 00:19:32.181 "dma_device_id": "system", 00:19:32.181 "dma_device_type": 1 00:19:32.181 }, 00:19:32.181 { 00:19:32.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.181 "dma_device_type": 2 00:19:32.181 } 00:19:32.181 ], 00:19:32.181 "driver_specific": {} 00:19:32.181 } 00:19:32.181 ] 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:32.181 07:55:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.181 07:55:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.442 07:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.442 "name": "Existed_Raid", 00:19:32.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.442 "strip_size_kb": 0, 00:19:32.442 "state": "configuring", 00:19:32.442 "raid_level": "raid1", 00:19:32.442 "superblock": false, 00:19:32.442 "num_base_bdevs": 4, 00:19:32.442 "num_base_bdevs_discovered": 3, 00:19:32.442 "num_base_bdevs_operational": 4, 00:19:32.442 "base_bdevs_list": [ 00:19:32.442 { 00:19:32.442 "name": "BaseBdev1", 00:19:32.442 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:32.442 "is_configured": true, 00:19:32.442 "data_offset": 0, 00:19:32.442 "data_size": 65536 00:19:32.442 }, 00:19:32.442 { 00:19:32.442 "name": null, 00:19:32.442 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:32.442 "is_configured": false, 00:19:32.442 "data_offset": 0, 00:19:32.442 "data_size": 65536 00:19:32.442 }, 00:19:32.442 { 00:19:32.442 "name": "BaseBdev3", 00:19:32.442 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:32.442 "is_configured": true, 00:19:32.442 "data_offset": 0, 00:19:32.442 "data_size": 65536 00:19:32.442 }, 00:19:32.442 { 00:19:32.442 "name": "BaseBdev4", 00:19:32.442 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:32.442 "is_configured": true, 00:19:32.442 "data_offset": 0, 00:19:32.442 "data_size": 65536 00:19:32.442 } 00:19:32.442 ] 00:19:32.442 }' 00:19:32.442 07:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.442 07:55:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.013 07:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.013 07:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:33.293 07:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:33.293 07:55:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:33.553 [2024-07-15 07:55:18.073147] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.553 "name": "Existed_Raid", 00:19:33.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.553 "strip_size_kb": 0, 00:19:33.553 "state": "configuring", 00:19:33.553 "raid_level": "raid1", 00:19:33.553 "superblock": false, 00:19:33.553 "num_base_bdevs": 4, 00:19:33.553 "num_base_bdevs_discovered": 2, 00:19:33.553 "num_base_bdevs_operational": 4, 00:19:33.553 "base_bdevs_list": [ 00:19:33.553 { 00:19:33.553 "name": "BaseBdev1", 00:19:33.553 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:33.553 "is_configured": true, 00:19:33.553 "data_offset": 0, 00:19:33.553 "data_size": 65536 00:19:33.553 }, 00:19:33.553 { 00:19:33.553 "name": null, 00:19:33.553 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:33.553 "is_configured": false, 00:19:33.553 "data_offset": 0, 00:19:33.553 "data_size": 65536 00:19:33.553 }, 00:19:33.553 { 00:19:33.553 "name": null, 00:19:33.553 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:33.553 "is_configured": false, 00:19:33.553 "data_offset": 0, 00:19:33.553 "data_size": 65536 00:19:33.553 }, 00:19:33.553 { 00:19:33.553 "name": "BaseBdev4", 00:19:33.553 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:33.553 "is_configured": true, 00:19:33.553 "data_offset": 0, 00:19:33.553 "data_size": 65536 00:19:33.553 } 00:19:33.553 ] 00:19:33.553 }' 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.553 07:55:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.191 07:55:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.191 07:55:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:34.451 [2024-07-15 07:55:19.179965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.451 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.452 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.711 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.711 "name": "Existed_Raid", 00:19:34.711 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.711 "strip_size_kb": 0, 00:19:34.711 "state": "configuring", 00:19:34.711 "raid_level": "raid1", 00:19:34.711 "superblock": false, 00:19:34.711 "num_base_bdevs": 4, 00:19:34.711 "num_base_bdevs_discovered": 3, 00:19:34.711 "num_base_bdevs_operational": 4, 00:19:34.711 "base_bdevs_list": [ 00:19:34.711 { 00:19:34.711 "name": "BaseBdev1", 00:19:34.711 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:34.711 "is_configured": true, 00:19:34.711 "data_offset": 0, 00:19:34.711 "data_size": 65536 00:19:34.711 }, 00:19:34.711 { 00:19:34.711 "name": null, 00:19:34.711 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:34.711 "is_configured": false, 00:19:34.711 "data_offset": 0, 00:19:34.711 "data_size": 65536 00:19:34.711 }, 00:19:34.711 { 00:19:34.711 "name": "BaseBdev3", 00:19:34.711 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:34.711 "is_configured": true, 00:19:34.711 "data_offset": 0, 00:19:34.711 "data_size": 65536 00:19:34.711 }, 00:19:34.711 { 00:19:34.711 "name": "BaseBdev4", 00:19:34.711 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:34.711 "is_configured": true, 00:19:34.711 "data_offset": 0, 00:19:34.711 "data_size": 65536 00:19:34.711 } 00:19:34.711 ] 00:19:34.711 }' 00:19:34.711 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
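The steps traced just above detach BaseBdev3 from the still-configuring Existed_Raid array and then re-attach it, using the per-slot is_configured flag as the check. A minimal hedged version of that round trip, reusing only the RPCs and jq filters that appear in the trace (the socket path and slot index 2 are specific to this run, and rpc.py is again referenced relative to an SPDK checkout):

    SOCK=/var/tmp/spdk-raid.sock
    RPC=./scripts/rpc.py
    # drop the third member; the raid1 bdev remains in state "configuring"
    "$RPC" -s "$SOCK" bdev_raid_remove_base_bdev BaseBdev3
    "$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # false
    # hand it back and confirm the slot is claimed again
    "$RPC" -s "$SOCK" bdev_raid_add_base_bdev Existed_Raid BaseBdev3
    "$RPC" -s "$SOCK" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # true
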
00:19:34.711 07:55:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.280 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.280 07:55:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:35.539 [2024-07-15 07:55:20.251091] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.539 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.799 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.799 "name": "Existed_Raid", 00:19:35.799 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.799 "strip_size_kb": 0, 00:19:35.799 "state": "configuring", 00:19:35.799 "raid_level": "raid1", 00:19:35.799 "superblock": false, 00:19:35.799 "num_base_bdevs": 4, 00:19:35.799 "num_base_bdevs_discovered": 2, 00:19:35.799 "num_base_bdevs_operational": 4, 00:19:35.799 "base_bdevs_list": [ 00:19:35.799 { 00:19:35.799 "name": null, 00:19:35.799 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:35.799 "is_configured": false, 00:19:35.799 "data_offset": 0, 00:19:35.799 "data_size": 65536 00:19:35.799 }, 00:19:35.799 { 00:19:35.799 "name": null, 00:19:35.799 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:35.799 "is_configured": false, 00:19:35.799 "data_offset": 0, 00:19:35.799 "data_size": 65536 00:19:35.799 }, 00:19:35.799 { 00:19:35.799 "name": "BaseBdev3", 00:19:35.799 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:35.799 "is_configured": true, 00:19:35.799 "data_offset": 0, 00:19:35.799 "data_size": 65536 00:19:35.799 }, 00:19:35.799 { 00:19:35.799 "name": 
"BaseBdev4", 00:19:35.799 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:35.799 "is_configured": true, 00:19:35.799 "data_offset": 0, 00:19:35.799 "data_size": 65536 00:19:35.799 } 00:19:35.799 ] 00:19:35.799 }' 00:19:35.799 07:55:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.799 07:55:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.370 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.370 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:36.630 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:36.630 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:36.890 [2024-07-15 07:55:21.471898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.890 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:37.151 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.151 "name": "Existed_Raid", 00:19:37.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.151 "strip_size_kb": 0, 00:19:37.151 "state": "configuring", 00:19:37.151 "raid_level": "raid1", 00:19:37.151 "superblock": false, 00:19:37.151 "num_base_bdevs": 4, 00:19:37.151 "num_base_bdevs_discovered": 3, 00:19:37.151 "num_base_bdevs_operational": 4, 00:19:37.151 "base_bdevs_list": [ 00:19:37.151 { 00:19:37.151 "name": null, 00:19:37.151 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:37.151 "is_configured": false, 00:19:37.151 "data_offset": 0, 00:19:37.151 "data_size": 65536 00:19:37.151 }, 00:19:37.151 { 00:19:37.151 "name": "BaseBdev2", 00:19:37.151 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:37.151 
"is_configured": true, 00:19:37.151 "data_offset": 0, 00:19:37.151 "data_size": 65536 00:19:37.151 }, 00:19:37.151 { 00:19:37.151 "name": "BaseBdev3", 00:19:37.151 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:37.151 "is_configured": true, 00:19:37.151 "data_offset": 0, 00:19:37.151 "data_size": 65536 00:19:37.151 }, 00:19:37.151 { 00:19:37.151 "name": "BaseBdev4", 00:19:37.151 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:37.151 "is_configured": true, 00:19:37.151 "data_offset": 0, 00:19:37.151 "data_size": 65536 00:19:37.151 } 00:19:37.151 ] 00:19:37.151 }' 00:19:37.151 07:55:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.151 07:55:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.721 07:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.721 07:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:37.981 07:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:37.981 07:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.981 07:55:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b 00:19:38.552 [2024-07-15 07:55:23.209292] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:38.552 [2024-07-15 07:55:23.209318] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7dc1a0 00:19:38.552 [2024-07-15 07:55:23.209322] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:38.552 [2024-07-15 07:55:23.209468] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7db5e0 00:19:38.552 [2024-07-15 07:55:23.209565] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7dc1a0 00:19:38.552 [2024-07-15 07:55:23.209571] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7dc1a0 00:19:38.552 [2024-07-15 07:55:23.209691] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:38.552 NewBaseBdev 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:38.552 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:38.812 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:39.071 [ 00:19:39.071 { 00:19:39.071 "name": "NewBaseBdev", 00:19:39.071 "aliases": [ 00:19:39.071 "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b" 00:19:39.071 ], 00:19:39.071 "product_name": "Malloc disk", 00:19:39.071 "block_size": 512, 00:19:39.071 "num_blocks": 65536, 00:19:39.071 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:39.071 "assigned_rate_limits": { 00:19:39.071 "rw_ios_per_sec": 0, 00:19:39.071 "rw_mbytes_per_sec": 0, 00:19:39.071 "r_mbytes_per_sec": 0, 00:19:39.071 "w_mbytes_per_sec": 0 00:19:39.071 }, 00:19:39.071 "claimed": true, 00:19:39.071 "claim_type": "exclusive_write", 00:19:39.071 "zoned": false, 00:19:39.071 "supported_io_types": { 00:19:39.071 "read": true, 00:19:39.071 "write": true, 00:19:39.071 "unmap": true, 00:19:39.071 "flush": true, 00:19:39.071 "reset": true, 00:19:39.071 "nvme_admin": false, 00:19:39.071 "nvme_io": false, 00:19:39.071 "nvme_io_md": false, 00:19:39.071 "write_zeroes": true, 00:19:39.071 "zcopy": true, 00:19:39.071 "get_zone_info": false, 00:19:39.071 "zone_management": false, 00:19:39.071 "zone_append": false, 00:19:39.071 "compare": false, 00:19:39.071 "compare_and_write": false, 00:19:39.071 "abort": true, 00:19:39.071 "seek_hole": false, 00:19:39.071 "seek_data": false, 00:19:39.071 "copy": true, 00:19:39.071 "nvme_iov_md": false 00:19:39.071 }, 00:19:39.071 "memory_domains": [ 00:19:39.071 { 00:19:39.071 "dma_device_id": "system", 00:19:39.071 "dma_device_type": 1 00:19:39.071 }, 00:19:39.071 { 00:19:39.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.071 "dma_device_type": 2 00:19:39.071 } 00:19:39.071 ], 00:19:39.071 "driver_specific": {} 00:19:39.071 } 00:19:39.071 ] 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:39.071 07:55:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.071 "name": "Existed_Raid", 00:19:39.071 "uuid": "ce7180d4-734a-4924-9ffa-fe5cb871eb89", 00:19:39.071 "strip_size_kb": 0, 00:19:39.071 "state": "online", 00:19:39.071 "raid_level": "raid1", 00:19:39.071 "superblock": false, 00:19:39.071 "num_base_bdevs": 4, 00:19:39.071 "num_base_bdevs_discovered": 4, 00:19:39.071 "num_base_bdevs_operational": 4, 00:19:39.071 "base_bdevs_list": [ 00:19:39.071 { 00:19:39.071 "name": "NewBaseBdev", 00:19:39.071 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:39.071 "is_configured": true, 00:19:39.071 "data_offset": 0, 00:19:39.071 "data_size": 65536 00:19:39.071 }, 00:19:39.071 { 00:19:39.071 "name": "BaseBdev2", 00:19:39.071 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:39.071 "is_configured": true, 00:19:39.071 "data_offset": 0, 00:19:39.071 "data_size": 65536 00:19:39.071 }, 00:19:39.071 { 00:19:39.071 "name": "BaseBdev3", 00:19:39.071 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:39.071 "is_configured": true, 00:19:39.071 "data_offset": 0, 00:19:39.071 "data_size": 65536 00:19:39.071 }, 00:19:39.071 { 00:19:39.071 "name": "BaseBdev4", 00:19:39.071 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:39.071 "is_configured": true, 00:19:39.071 "data_offset": 0, 00:19:39.071 "data_size": 65536 00:19:39.071 } 00:19:39.071 ] 00:19:39.071 }' 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.071 07:55:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.012 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:40.012 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:40.012 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:40.012 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:40.012 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:40.013 [2024-07-15 07:55:24.577018] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:40.013 "name": "Existed_Raid", 00:19:40.013 "aliases": [ 00:19:40.013 "ce7180d4-734a-4924-9ffa-fe5cb871eb89" 00:19:40.013 ], 00:19:40.013 "product_name": "Raid Volume", 00:19:40.013 "block_size": 512, 00:19:40.013 "num_blocks": 65536, 00:19:40.013 "uuid": "ce7180d4-734a-4924-9ffa-fe5cb871eb89", 00:19:40.013 "assigned_rate_limits": { 00:19:40.013 "rw_ios_per_sec": 0, 00:19:40.013 "rw_mbytes_per_sec": 0, 00:19:40.013 "r_mbytes_per_sec": 0, 00:19:40.013 "w_mbytes_per_sec": 0 00:19:40.013 }, 00:19:40.013 "claimed": false, 00:19:40.013 "zoned": false, 00:19:40.013 "supported_io_types": { 00:19:40.013 "read": true, 00:19:40.013 "write": true, 00:19:40.013 "unmap": false, 00:19:40.013 "flush": false, 00:19:40.013 "reset": true, 
00:19:40.013 "nvme_admin": false, 00:19:40.013 "nvme_io": false, 00:19:40.013 "nvme_io_md": false, 00:19:40.013 "write_zeroes": true, 00:19:40.013 "zcopy": false, 00:19:40.013 "get_zone_info": false, 00:19:40.013 "zone_management": false, 00:19:40.013 "zone_append": false, 00:19:40.013 "compare": false, 00:19:40.013 "compare_and_write": false, 00:19:40.013 "abort": false, 00:19:40.013 "seek_hole": false, 00:19:40.013 "seek_data": false, 00:19:40.013 "copy": false, 00:19:40.013 "nvme_iov_md": false 00:19:40.013 }, 00:19:40.013 "memory_domains": [ 00:19:40.013 { 00:19:40.013 "dma_device_id": "system", 00:19:40.013 "dma_device_type": 1 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.013 "dma_device_type": 2 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "system", 00:19:40.013 "dma_device_type": 1 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.013 "dma_device_type": 2 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "system", 00:19:40.013 "dma_device_type": 1 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.013 "dma_device_type": 2 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "system", 00:19:40.013 "dma_device_type": 1 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.013 "dma_device_type": 2 00:19:40.013 } 00:19:40.013 ], 00:19:40.013 "driver_specific": { 00:19:40.013 "raid": { 00:19:40.013 "uuid": "ce7180d4-734a-4924-9ffa-fe5cb871eb89", 00:19:40.013 "strip_size_kb": 0, 00:19:40.013 "state": "online", 00:19:40.013 "raid_level": "raid1", 00:19:40.013 "superblock": false, 00:19:40.013 "num_base_bdevs": 4, 00:19:40.013 "num_base_bdevs_discovered": 4, 00:19:40.013 "num_base_bdevs_operational": 4, 00:19:40.013 "base_bdevs_list": [ 00:19:40.013 { 00:19:40.013 "name": "NewBaseBdev", 00:19:40.013 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:40.013 "is_configured": true, 00:19:40.013 "data_offset": 0, 00:19:40.013 "data_size": 65536 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "name": "BaseBdev2", 00:19:40.013 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:40.013 "is_configured": true, 00:19:40.013 "data_offset": 0, 00:19:40.013 "data_size": 65536 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "name": "BaseBdev3", 00:19:40.013 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:40.013 "is_configured": true, 00:19:40.013 "data_offset": 0, 00:19:40.013 "data_size": 65536 00:19:40.013 }, 00:19:40.013 { 00:19:40.013 "name": "BaseBdev4", 00:19:40.013 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:40.013 "is_configured": true, 00:19:40.013 "data_offset": 0, 00:19:40.013 "data_size": 65536 00:19:40.013 } 00:19:40.013 ] 00:19:40.013 } 00:19:40.013 } 00:19:40.013 }' 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:40.013 BaseBdev2 00:19:40.013 BaseBdev3 00:19:40.013 BaseBdev4' 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.013 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:40.013 07:55:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:40.274 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:40.274 "name": "NewBaseBdev", 00:19:40.274 "aliases": [ 00:19:40.274 "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b" 00:19:40.274 ], 00:19:40.274 "product_name": "Malloc disk", 00:19:40.274 "block_size": 512, 00:19:40.274 "num_blocks": 65536, 00:19:40.274 "uuid": "2b7141c7-bf8b-4c5a-8b3d-f13a38619b3b", 00:19:40.274 "assigned_rate_limits": { 00:19:40.274 "rw_ios_per_sec": 0, 00:19:40.274 "rw_mbytes_per_sec": 0, 00:19:40.274 "r_mbytes_per_sec": 0, 00:19:40.274 "w_mbytes_per_sec": 0 00:19:40.274 }, 00:19:40.274 "claimed": true, 00:19:40.274 "claim_type": "exclusive_write", 00:19:40.274 "zoned": false, 00:19:40.274 "supported_io_types": { 00:19:40.274 "read": true, 00:19:40.274 "write": true, 00:19:40.274 "unmap": true, 00:19:40.274 "flush": true, 00:19:40.274 "reset": true, 00:19:40.274 "nvme_admin": false, 00:19:40.274 "nvme_io": false, 00:19:40.274 "nvme_io_md": false, 00:19:40.274 "write_zeroes": true, 00:19:40.274 "zcopy": true, 00:19:40.274 "get_zone_info": false, 00:19:40.274 "zone_management": false, 00:19:40.274 "zone_append": false, 00:19:40.274 "compare": false, 00:19:40.274 "compare_and_write": false, 00:19:40.274 "abort": true, 00:19:40.274 "seek_hole": false, 00:19:40.274 "seek_data": false, 00:19:40.274 "copy": true, 00:19:40.274 "nvme_iov_md": false 00:19:40.274 }, 00:19:40.274 "memory_domains": [ 00:19:40.274 { 00:19:40.274 "dma_device_id": "system", 00:19:40.274 "dma_device_type": 1 00:19:40.274 }, 00:19:40.274 { 00:19:40.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.274 "dma_device_type": 2 00:19:40.274 } 00:19:40.274 ], 00:19:40.274 "driver_specific": {} 00:19:40.274 }' 00:19:40.274 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.274 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.274 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:40.274 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.274 07:55:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:40.533 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:41.103 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:19:41.103 "name": "BaseBdev2", 00:19:41.103 "aliases": [ 00:19:41.103 "b91c4cf8-7124-485c-9e88-de2f2aaf0c99" 00:19:41.103 ], 00:19:41.103 "product_name": "Malloc disk", 00:19:41.103 "block_size": 512, 00:19:41.103 "num_blocks": 65536, 00:19:41.103 "uuid": "b91c4cf8-7124-485c-9e88-de2f2aaf0c99", 00:19:41.103 "assigned_rate_limits": { 00:19:41.103 "rw_ios_per_sec": 0, 00:19:41.103 "rw_mbytes_per_sec": 0, 00:19:41.103 "r_mbytes_per_sec": 0, 00:19:41.103 "w_mbytes_per_sec": 0 00:19:41.103 }, 00:19:41.103 "claimed": true, 00:19:41.103 "claim_type": "exclusive_write", 00:19:41.103 "zoned": false, 00:19:41.103 "supported_io_types": { 00:19:41.103 "read": true, 00:19:41.103 "write": true, 00:19:41.103 "unmap": true, 00:19:41.103 "flush": true, 00:19:41.103 "reset": true, 00:19:41.103 "nvme_admin": false, 00:19:41.103 "nvme_io": false, 00:19:41.103 "nvme_io_md": false, 00:19:41.103 "write_zeroes": true, 00:19:41.103 "zcopy": true, 00:19:41.103 "get_zone_info": false, 00:19:41.103 "zone_management": false, 00:19:41.103 "zone_append": false, 00:19:41.103 "compare": false, 00:19:41.103 "compare_and_write": false, 00:19:41.103 "abort": true, 00:19:41.103 "seek_hole": false, 00:19:41.103 "seek_data": false, 00:19:41.103 "copy": true, 00:19:41.103 "nvme_iov_md": false 00:19:41.103 }, 00:19:41.103 "memory_domains": [ 00:19:41.103 { 00:19:41.103 "dma_device_id": "system", 00:19:41.103 "dma_device_type": 1 00:19:41.103 }, 00:19:41.103 { 00:19:41.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.103 "dma_device_type": 2 00:19:41.103 } 00:19:41.103 ], 00:19:41.103 "driver_specific": {} 00:19:41.103 }' 00:19:41.103 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.103 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.364 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.364 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.364 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.364 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.364 07:55:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.364 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.364 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.364 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.364 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.625 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.625 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.625 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:41.625 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:41.625 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:41.625 "name": "BaseBdev3", 00:19:41.625 "aliases": [ 00:19:41.625 "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439" 00:19:41.625 ], 00:19:41.625 "product_name": "Malloc disk", 
00:19:41.625 "block_size": 512, 00:19:41.625 "num_blocks": 65536, 00:19:41.625 "uuid": "2621e80a-72fb-4d9d-a8ea-e0ca7bd55439", 00:19:41.625 "assigned_rate_limits": { 00:19:41.625 "rw_ios_per_sec": 0, 00:19:41.625 "rw_mbytes_per_sec": 0, 00:19:41.625 "r_mbytes_per_sec": 0, 00:19:41.625 "w_mbytes_per_sec": 0 00:19:41.625 }, 00:19:41.625 "claimed": true, 00:19:41.625 "claim_type": "exclusive_write", 00:19:41.625 "zoned": false, 00:19:41.625 "supported_io_types": { 00:19:41.625 "read": true, 00:19:41.625 "write": true, 00:19:41.625 "unmap": true, 00:19:41.625 "flush": true, 00:19:41.625 "reset": true, 00:19:41.625 "nvme_admin": false, 00:19:41.625 "nvme_io": false, 00:19:41.625 "nvme_io_md": false, 00:19:41.625 "write_zeroes": true, 00:19:41.625 "zcopy": true, 00:19:41.625 "get_zone_info": false, 00:19:41.625 "zone_management": false, 00:19:41.625 "zone_append": false, 00:19:41.625 "compare": false, 00:19:41.625 "compare_and_write": false, 00:19:41.625 "abort": true, 00:19:41.625 "seek_hole": false, 00:19:41.625 "seek_data": false, 00:19:41.625 "copy": true, 00:19:41.625 "nvme_iov_md": false 00:19:41.625 }, 00:19:41.625 "memory_domains": [ 00:19:41.625 { 00:19:41.625 "dma_device_id": "system", 00:19:41.625 "dma_device_type": 1 00:19:41.625 }, 00:19:41.625 { 00:19:41.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.625 "dma_device_type": 2 00:19:41.625 } 00:19:41.625 ], 00:19:41.625 "driver_specific": {} 00:19:41.625 }' 00:19:41.625 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.886 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.146 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.146 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.146 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:42.146 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.146 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.146 "name": "BaseBdev4", 00:19:42.146 "aliases": [ 00:19:42.146 "2628d0c6-e214-4579-8ce5-cecb07c1e27f" 00:19:42.146 ], 00:19:42.146 "product_name": "Malloc disk", 00:19:42.146 "block_size": 512, 00:19:42.146 "num_blocks": 65536, 00:19:42.146 "uuid": "2628d0c6-e214-4579-8ce5-cecb07c1e27f", 00:19:42.146 "assigned_rate_limits": { 00:19:42.146 
"rw_ios_per_sec": 0, 00:19:42.146 "rw_mbytes_per_sec": 0, 00:19:42.146 "r_mbytes_per_sec": 0, 00:19:42.146 "w_mbytes_per_sec": 0 00:19:42.146 }, 00:19:42.146 "claimed": true, 00:19:42.146 "claim_type": "exclusive_write", 00:19:42.146 "zoned": false, 00:19:42.146 "supported_io_types": { 00:19:42.146 "read": true, 00:19:42.146 "write": true, 00:19:42.146 "unmap": true, 00:19:42.146 "flush": true, 00:19:42.146 "reset": true, 00:19:42.146 "nvme_admin": false, 00:19:42.146 "nvme_io": false, 00:19:42.146 "nvme_io_md": false, 00:19:42.146 "write_zeroes": true, 00:19:42.146 "zcopy": true, 00:19:42.146 "get_zone_info": false, 00:19:42.146 "zone_management": false, 00:19:42.146 "zone_append": false, 00:19:42.146 "compare": false, 00:19:42.146 "compare_and_write": false, 00:19:42.146 "abort": true, 00:19:42.146 "seek_hole": false, 00:19:42.146 "seek_data": false, 00:19:42.146 "copy": true, 00:19:42.146 "nvme_iov_md": false 00:19:42.146 }, 00:19:42.146 "memory_domains": [ 00:19:42.146 { 00:19:42.146 "dma_device_id": "system", 00:19:42.146 "dma_device_type": 1 00:19:42.146 }, 00:19:42.146 { 00:19:42.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.146 "dma_device_type": 2 00:19:42.146 } 00:19:42.146 ], 00:19:42.146 "driver_specific": {} 00:19:42.146 }' 00:19:42.146 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.406 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.406 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.406 07:55:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.406 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.406 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.406 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.406 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.667 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.667 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.667 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.667 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.667 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:42.928 [2024-07-15 07:55:27.504191] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:42.928 [2024-07-15 07:55:27.504211] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:42.928 [2024-07-15 07:55:27.504250] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:42.928 [2024-07-15 07:55:27.504457] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:42.928 [2024-07-15 07:55:27.504463] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7dc1a0 name Existed_Raid, state offline 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1686720 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@948 -- # '[' -z 1686720 ']' 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1686720 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1686720 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1686720' 00:19:42.928 killing process with pid 1686720 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1686720 00:19:42.928 [2024-07-15 07:55:27.590872] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:42.928 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1686720 00:19:42.928 [2024-07-15 07:55:27.611284] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:43.191 00:19:43.191 real 0m28.811s 00:19:43.191 user 0m54.662s 00:19:43.191 sys 0m4.060s 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.191 ************************************ 00:19:43.191 END TEST raid_state_function_test 00:19:43.191 ************************************ 00:19:43.191 07:55:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:43.191 07:55:27 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:19:43.191 07:55:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:43.191 07:55:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:43.191 07:55:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:43.191 ************************************ 00:19:43.191 START TEST raid_state_function_test_sb 00:19:43.191 ************************************ 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.191 07:55:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1692218 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1692218' 00:19:43.191 Process raid pid: 1692218 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1692218 /var/tmp/spdk-raid.sock 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1692218 ']' 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:43.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
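The raid_state_function_test flows recorded in this log drive the SPDK target purely through scripts/rpc.py on the /var/tmp/spdk-raid.sock socket. Below is a minimal standalone sketch of that RPC sequence; it is not the autotest script itself, and it assumes an SPDK application is already listening on that socket and that jq is installed. Only RPC names and flags that appear verbatim in this log are used (bdev_malloc_create, bdev_raid_create -s -r raid1, bdev_raid_get_bdevs, bdev_get_bdevs, bdev_raid_delete).

    RPC='./scripts/rpc.py -s /var/tmp/spdk-raid.sock'   # assumed socket path, matching this log

    # Create four 32 MiB malloc base bdevs with a 512-byte block size (65536 blocks each).
    for name in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
        $RPC bdev_malloc_create 32 512 -b "$name"
    done

    # Assemble them into a RAID1 bdev with an on-disk superblock (-s).
    $RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

    # Inspect the raid bdev state and one base bdev's block size, as the test does with jq.
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
    $RPC bdev_get_bdevs -b BaseBdev1 | jq '.[] | .block_size'

    # Tear the raid bdev down again.
    $RPC bdev_raid_delete Existed_Raid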
00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:43.191 07:55:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:43.191 [2024-07-15 07:55:27.873743] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:19:43.191 [2024-07-15 07:55:27.873799] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:43.452 [2024-07-15 07:55:27.966181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:43.452 [2024-07-15 07:55:28.042173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:43.452 [2024-07-15 07:55:28.085059] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:43.452 [2024-07-15 07:55:28.085082] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:44.026 07:55:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:44.026 07:55:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:44.026 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:44.287 [2024-07-15 07:55:28.900524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:44.287 [2024-07-15 07:55:28.900551] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:44.287 [2024-07-15 07:55:28.900557] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:44.287 [2024-07-15 07:55:28.900563] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:44.287 [2024-07-15 07:55:28.900572] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:44.287 [2024-07-15 07:55:28.900577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:44.287 [2024-07-15 07:55:28.900582] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:44.287 [2024-07-15 07:55:28.900587] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.287 07:55:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:44.547 07:55:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.547 "name": "Existed_Raid", 00:19:44.547 "uuid": "966f11d9-48c6-400a-8636-3aa4355a946a", 00:19:44.547 "strip_size_kb": 0, 00:19:44.547 "state": "configuring", 00:19:44.547 "raid_level": "raid1", 00:19:44.547 "superblock": true, 00:19:44.547 "num_base_bdevs": 4, 00:19:44.547 "num_base_bdevs_discovered": 0, 00:19:44.548 "num_base_bdevs_operational": 4, 00:19:44.548 "base_bdevs_list": [ 00:19:44.548 { 00:19:44.548 "name": "BaseBdev1", 00:19:44.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.548 "is_configured": false, 00:19:44.548 "data_offset": 0, 00:19:44.548 "data_size": 0 00:19:44.548 }, 00:19:44.548 { 00:19:44.548 "name": "BaseBdev2", 00:19:44.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.548 "is_configured": false, 00:19:44.548 "data_offset": 0, 00:19:44.548 "data_size": 0 00:19:44.548 }, 00:19:44.548 { 00:19:44.548 "name": "BaseBdev3", 00:19:44.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.548 "is_configured": false, 00:19:44.548 "data_offset": 0, 00:19:44.548 "data_size": 0 00:19:44.548 }, 00:19:44.548 { 00:19:44.548 "name": "BaseBdev4", 00:19:44.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.548 "is_configured": false, 00:19:44.548 "data_offset": 0, 00:19:44.548 "data_size": 0 00:19:44.548 } 00:19:44.548 ] 00:19:44.548 }' 00:19:44.548 07:55:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.548 07:55:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.120 07:55:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:45.381 [2024-07-15 07:55:29.955062] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:45.381 [2024-07-15 07:55:29.955081] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20636f0 name Existed_Raid, state configuring 00:19:45.381 07:55:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:45.641 [2024-07-15 07:55:30.167639] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:45.641 [2024-07-15 07:55:30.167670] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:45.641 [2024-07-15 07:55:30.167680] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:45.641 [2024-07-15 07:55:30.167685] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:45.641 [2024-07-15 07:55:30.167690] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:45.641 [2024-07-15 07:55:30.167695] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:45.641 [2024-07-15 07:55:30.167700] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:45.641 [2024-07-15 07:55:30.167705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:45.641 [2024-07-15 07:55:30.370425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:45.641 BaseBdev1 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:45.641 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:45.902 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:46.163 [ 00:19:46.163 { 00:19:46.163 "name": "BaseBdev1", 00:19:46.163 "aliases": [ 00:19:46.163 "ce64cb49-49b6-4a74-a75a-5ffcce9572df" 00:19:46.163 ], 00:19:46.163 "product_name": "Malloc disk", 00:19:46.163 "block_size": 512, 00:19:46.163 "num_blocks": 65536, 00:19:46.163 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:46.163 "assigned_rate_limits": { 00:19:46.163 "rw_ios_per_sec": 0, 00:19:46.163 "rw_mbytes_per_sec": 0, 00:19:46.163 "r_mbytes_per_sec": 0, 00:19:46.163 "w_mbytes_per_sec": 0 00:19:46.163 }, 00:19:46.163 "claimed": true, 00:19:46.163 "claim_type": "exclusive_write", 00:19:46.163 "zoned": false, 00:19:46.163 "supported_io_types": { 00:19:46.163 "read": true, 00:19:46.163 "write": true, 00:19:46.163 "unmap": true, 00:19:46.163 "flush": true, 00:19:46.163 "reset": true, 00:19:46.163 "nvme_admin": false, 00:19:46.163 "nvme_io": false, 00:19:46.163 "nvme_io_md": false, 00:19:46.163 "write_zeroes": true, 00:19:46.163 "zcopy": true, 00:19:46.163 "get_zone_info": false, 00:19:46.163 "zone_management": false, 00:19:46.163 "zone_append": false, 00:19:46.163 "compare": false, 00:19:46.163 "compare_and_write": false, 00:19:46.163 "abort": true, 00:19:46.163 "seek_hole": false, 00:19:46.163 "seek_data": false, 00:19:46.163 "copy": true, 00:19:46.163 "nvme_iov_md": false 00:19:46.163 }, 00:19:46.163 "memory_domains": [ 00:19:46.163 { 00:19:46.163 "dma_device_id": "system", 00:19:46.163 "dma_device_type": 1 00:19:46.163 }, 00:19:46.163 { 00:19:46.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.163 "dma_device_type": 2 00:19:46.163 } 00:19:46.163 ], 00:19:46.163 
"driver_specific": {} 00:19:46.163 } 00:19:46.163 ] 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.163 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.164 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.164 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.164 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.164 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.424 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.424 "name": "Existed_Raid", 00:19:46.424 "uuid": "3cb45f5c-6693-41af-af03-24a3a43086e8", 00:19:46.424 "strip_size_kb": 0, 00:19:46.424 "state": "configuring", 00:19:46.424 "raid_level": "raid1", 00:19:46.424 "superblock": true, 00:19:46.424 "num_base_bdevs": 4, 00:19:46.424 "num_base_bdevs_discovered": 1, 00:19:46.424 "num_base_bdevs_operational": 4, 00:19:46.424 "base_bdevs_list": [ 00:19:46.424 { 00:19:46.424 "name": "BaseBdev1", 00:19:46.424 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:46.424 "is_configured": true, 00:19:46.424 "data_offset": 2048, 00:19:46.424 "data_size": 63488 00:19:46.424 }, 00:19:46.424 { 00:19:46.424 "name": "BaseBdev2", 00:19:46.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.424 "is_configured": false, 00:19:46.424 "data_offset": 0, 00:19:46.424 "data_size": 0 00:19:46.424 }, 00:19:46.424 { 00:19:46.424 "name": "BaseBdev3", 00:19:46.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.424 "is_configured": false, 00:19:46.424 "data_offset": 0, 00:19:46.424 "data_size": 0 00:19:46.424 }, 00:19:46.424 { 00:19:46.424 "name": "BaseBdev4", 00:19:46.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.424 "is_configured": false, 00:19:46.424 "data_offset": 0, 00:19:46.424 "data_size": 0 00:19:46.424 } 00:19:46.424 ] 00:19:46.424 }' 00:19:46.424 07:55:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.424 07:55:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:46.994 07:55:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:47.254 
[2024-07-15 07:55:31.906315] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:47.254 [2024-07-15 07:55:31.906348] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2062f60 name Existed_Raid, state configuring 00:19:47.254 07:55:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:47.514 [2024-07-15 07:55:32.082792] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:47.514 [2024-07-15 07:55:32.083892] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:47.514 [2024-07-15 07:55:32.083916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:47.514 [2024-07-15 07:55:32.083922] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:47.514 [2024-07-15 07:55:32.083928] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:47.514 [2024-07-15 07:55:32.083933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:47.514 [2024-07-15 07:55:32.083938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.514 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:47.774 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.774 "name": "Existed_Raid", 00:19:47.774 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:47.774 "strip_size_kb": 0, 00:19:47.774 "state": "configuring", 00:19:47.774 "raid_level": "raid1", 00:19:47.774 "superblock": true, 00:19:47.774 
"num_base_bdevs": 4, 00:19:47.774 "num_base_bdevs_discovered": 1, 00:19:47.774 "num_base_bdevs_operational": 4, 00:19:47.774 "base_bdevs_list": [ 00:19:47.774 { 00:19:47.774 "name": "BaseBdev1", 00:19:47.774 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:47.774 "is_configured": true, 00:19:47.774 "data_offset": 2048, 00:19:47.774 "data_size": 63488 00:19:47.774 }, 00:19:47.774 { 00:19:47.774 "name": "BaseBdev2", 00:19:47.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.774 "is_configured": false, 00:19:47.774 "data_offset": 0, 00:19:47.774 "data_size": 0 00:19:47.774 }, 00:19:47.774 { 00:19:47.774 "name": "BaseBdev3", 00:19:47.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.774 "is_configured": false, 00:19:47.774 "data_offset": 0, 00:19:47.774 "data_size": 0 00:19:47.774 }, 00:19:47.774 { 00:19:47.774 "name": "BaseBdev4", 00:19:47.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.774 "is_configured": false, 00:19:47.774 "data_offset": 0, 00:19:47.774 "data_size": 0 00:19:47.774 } 00:19:47.774 ] 00:19:47.774 }' 00:19:47.774 07:55:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.774 07:55:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:48.789 [2024-07-15 07:55:33.399232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:48.789 BaseBdev2 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:48.789 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:49.049 [ 00:19:49.049 { 00:19:49.049 "name": "BaseBdev2", 00:19:49.049 "aliases": [ 00:19:49.049 "827cfeeb-42cb-41a4-b9f9-472df378e217" 00:19:49.049 ], 00:19:49.049 "product_name": "Malloc disk", 00:19:49.049 "block_size": 512, 00:19:49.049 "num_blocks": 65536, 00:19:49.049 "uuid": "827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:49.049 "assigned_rate_limits": { 00:19:49.049 "rw_ios_per_sec": 0, 00:19:49.049 "rw_mbytes_per_sec": 0, 00:19:49.049 "r_mbytes_per_sec": 0, 00:19:49.049 "w_mbytes_per_sec": 0 00:19:49.049 }, 00:19:49.049 "claimed": true, 00:19:49.049 "claim_type": "exclusive_write", 00:19:49.049 "zoned": false, 00:19:49.049 "supported_io_types": { 00:19:49.049 "read": true, 00:19:49.049 "write": true, 00:19:49.049 "unmap": true, 00:19:49.049 "flush": true, 
00:19:49.049 "reset": true, 00:19:49.049 "nvme_admin": false, 00:19:49.049 "nvme_io": false, 00:19:49.049 "nvme_io_md": false, 00:19:49.049 "write_zeroes": true, 00:19:49.049 "zcopy": true, 00:19:49.049 "get_zone_info": false, 00:19:49.049 "zone_management": false, 00:19:49.049 "zone_append": false, 00:19:49.049 "compare": false, 00:19:49.049 "compare_and_write": false, 00:19:49.049 "abort": true, 00:19:49.049 "seek_hole": false, 00:19:49.049 "seek_data": false, 00:19:49.049 "copy": true, 00:19:49.049 "nvme_iov_md": false 00:19:49.049 }, 00:19:49.049 "memory_domains": [ 00:19:49.049 { 00:19:49.049 "dma_device_id": "system", 00:19:49.049 "dma_device_type": 1 00:19:49.049 }, 00:19:49.049 { 00:19:49.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.049 "dma_device_type": 2 00:19:49.049 } 00:19:49.049 ], 00:19:49.049 "driver_specific": {} 00:19:49.049 } 00:19:49.049 ] 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:49.049 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.308 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.308 "name": "Existed_Raid", 00:19:49.308 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:49.308 "strip_size_kb": 0, 00:19:49.308 "state": "configuring", 00:19:49.308 "raid_level": "raid1", 00:19:49.308 "superblock": true, 00:19:49.308 "num_base_bdevs": 4, 00:19:49.308 "num_base_bdevs_discovered": 2, 00:19:49.308 "num_base_bdevs_operational": 4, 00:19:49.308 "base_bdevs_list": [ 00:19:49.308 { 00:19:49.308 "name": "BaseBdev1", 00:19:49.308 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:49.308 "is_configured": true, 00:19:49.308 "data_offset": 2048, 00:19:49.308 "data_size": 63488 00:19:49.308 }, 00:19:49.308 { 00:19:49.308 "name": "BaseBdev2", 00:19:49.308 "uuid": 
"827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:49.308 "is_configured": true, 00:19:49.308 "data_offset": 2048, 00:19:49.308 "data_size": 63488 00:19:49.308 }, 00:19:49.308 { 00:19:49.308 "name": "BaseBdev3", 00:19:49.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.308 "is_configured": false, 00:19:49.308 "data_offset": 0, 00:19:49.308 "data_size": 0 00:19:49.308 }, 00:19:49.308 { 00:19:49.308 "name": "BaseBdev4", 00:19:49.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.308 "is_configured": false, 00:19:49.308 "data_offset": 0, 00:19:49.308 "data_size": 0 00:19:49.308 } 00:19:49.308 ] 00:19:49.308 }' 00:19:49.308 07:55:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.308 07:55:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.876 07:55:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:50.138 [2024-07-15 07:55:34.647163] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:50.138 BaseBdev3 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:50.138 07:55:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:50.397 [ 00:19:50.397 { 00:19:50.397 "name": "BaseBdev3", 00:19:50.397 "aliases": [ 00:19:50.397 "087bfbd7-b198-4039-9fda-25ce6df52f7a" 00:19:50.397 ], 00:19:50.397 "product_name": "Malloc disk", 00:19:50.397 "block_size": 512, 00:19:50.397 "num_blocks": 65536, 00:19:50.397 "uuid": "087bfbd7-b198-4039-9fda-25ce6df52f7a", 00:19:50.397 "assigned_rate_limits": { 00:19:50.397 "rw_ios_per_sec": 0, 00:19:50.397 "rw_mbytes_per_sec": 0, 00:19:50.397 "r_mbytes_per_sec": 0, 00:19:50.397 "w_mbytes_per_sec": 0 00:19:50.397 }, 00:19:50.397 "claimed": true, 00:19:50.397 "claim_type": "exclusive_write", 00:19:50.397 "zoned": false, 00:19:50.397 "supported_io_types": { 00:19:50.397 "read": true, 00:19:50.397 "write": true, 00:19:50.397 "unmap": true, 00:19:50.397 "flush": true, 00:19:50.397 "reset": true, 00:19:50.397 "nvme_admin": false, 00:19:50.397 "nvme_io": false, 00:19:50.397 "nvme_io_md": false, 00:19:50.397 "write_zeroes": true, 00:19:50.397 "zcopy": true, 00:19:50.397 "get_zone_info": false, 00:19:50.397 "zone_management": false, 00:19:50.397 "zone_append": false, 00:19:50.397 "compare": false, 00:19:50.397 "compare_and_write": false, 00:19:50.397 "abort": true, 00:19:50.397 "seek_hole": false, 00:19:50.397 
"seek_data": false, 00:19:50.397 "copy": true, 00:19:50.397 "nvme_iov_md": false 00:19:50.397 }, 00:19:50.397 "memory_domains": [ 00:19:50.397 { 00:19:50.397 "dma_device_id": "system", 00:19:50.397 "dma_device_type": 1 00:19:50.397 }, 00:19:50.397 { 00:19:50.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.397 "dma_device_type": 2 00:19:50.397 } 00:19:50.397 ], 00:19:50.397 "driver_specific": {} 00:19:50.397 } 00:19:50.397 ] 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.397 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.656 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.656 "name": "Existed_Raid", 00:19:50.656 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:50.656 "strip_size_kb": 0, 00:19:50.656 "state": "configuring", 00:19:50.656 "raid_level": "raid1", 00:19:50.656 "superblock": true, 00:19:50.656 "num_base_bdevs": 4, 00:19:50.656 "num_base_bdevs_discovered": 3, 00:19:50.656 "num_base_bdevs_operational": 4, 00:19:50.656 "base_bdevs_list": [ 00:19:50.656 { 00:19:50.656 "name": "BaseBdev1", 00:19:50.656 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:50.656 "is_configured": true, 00:19:50.656 "data_offset": 2048, 00:19:50.656 "data_size": 63488 00:19:50.656 }, 00:19:50.656 { 00:19:50.656 "name": "BaseBdev2", 00:19:50.656 "uuid": "827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:50.656 "is_configured": true, 00:19:50.656 "data_offset": 2048, 00:19:50.656 "data_size": 63488 00:19:50.656 }, 00:19:50.656 { 00:19:50.656 "name": "BaseBdev3", 00:19:50.656 "uuid": "087bfbd7-b198-4039-9fda-25ce6df52f7a", 00:19:50.656 "is_configured": true, 00:19:50.656 "data_offset": 2048, 00:19:50.656 "data_size": 63488 00:19:50.656 }, 00:19:50.656 { 00:19:50.656 "name": "BaseBdev4", 00:19:50.656 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:50.656 "is_configured": false, 00:19:50.656 "data_offset": 0, 00:19:50.656 "data_size": 0 00:19:50.656 } 00:19:50.656 ] 00:19:50.656 }' 00:19:50.656 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.656 07:55:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:51.223 07:55:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:51.483 [2024-07-15 07:55:35.995491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:51.483 [2024-07-15 07:55:35.995621] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2063fc0 00:19:51.483 [2024-07-15 07:55:35.995629] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:51.483 [2024-07-15 07:55:35.995773] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2063c00 00:19:51.483 [2024-07-15 07:55:35.995875] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2063fc0 00:19:51.483 [2024-07-15 07:55:35.995881] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2063fc0 00:19:51.483 [2024-07-15 07:55:35.995948] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.483 BaseBdev4 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:51.483 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:51.743 [ 00:19:51.743 { 00:19:51.743 "name": "BaseBdev4", 00:19:51.743 "aliases": [ 00:19:51.743 "55bbf182-8872-435f-a4b5-dcfc3389ebf1" 00:19:51.743 ], 00:19:51.743 "product_name": "Malloc disk", 00:19:51.743 "block_size": 512, 00:19:51.743 "num_blocks": 65536, 00:19:51.743 "uuid": "55bbf182-8872-435f-a4b5-dcfc3389ebf1", 00:19:51.743 "assigned_rate_limits": { 00:19:51.743 "rw_ios_per_sec": 0, 00:19:51.743 "rw_mbytes_per_sec": 0, 00:19:51.743 "r_mbytes_per_sec": 0, 00:19:51.743 "w_mbytes_per_sec": 0 00:19:51.743 }, 00:19:51.743 "claimed": true, 00:19:51.743 "claim_type": "exclusive_write", 00:19:51.743 "zoned": false, 00:19:51.743 "supported_io_types": { 00:19:51.743 "read": true, 00:19:51.743 "write": true, 00:19:51.743 "unmap": true, 00:19:51.743 "flush": true, 00:19:51.743 "reset": true, 00:19:51.743 "nvme_admin": false, 00:19:51.743 "nvme_io": false, 00:19:51.743 "nvme_io_md": false, 00:19:51.743 
"write_zeroes": true, 00:19:51.743 "zcopy": true, 00:19:51.743 "get_zone_info": false, 00:19:51.743 "zone_management": false, 00:19:51.743 "zone_append": false, 00:19:51.743 "compare": false, 00:19:51.743 "compare_and_write": false, 00:19:51.743 "abort": true, 00:19:51.743 "seek_hole": false, 00:19:51.743 "seek_data": false, 00:19:51.743 "copy": true, 00:19:51.743 "nvme_iov_md": false 00:19:51.743 }, 00:19:51.743 "memory_domains": [ 00:19:51.743 { 00:19:51.743 "dma_device_id": "system", 00:19:51.743 "dma_device_type": 1 00:19:51.743 }, 00:19:51.743 { 00:19:51.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.743 "dma_device_type": 2 00:19:51.743 } 00:19:51.743 ], 00:19:51.743 "driver_specific": {} 00:19:51.743 } 00:19:51.743 ] 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.743 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.002 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.002 "name": "Existed_Raid", 00:19:52.002 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:52.002 "strip_size_kb": 0, 00:19:52.002 "state": "online", 00:19:52.002 "raid_level": "raid1", 00:19:52.002 "superblock": true, 00:19:52.002 "num_base_bdevs": 4, 00:19:52.002 "num_base_bdevs_discovered": 4, 00:19:52.002 "num_base_bdevs_operational": 4, 00:19:52.002 "base_bdevs_list": [ 00:19:52.002 { 00:19:52.002 "name": "BaseBdev1", 00:19:52.002 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:52.002 "is_configured": true, 00:19:52.002 "data_offset": 2048, 00:19:52.002 "data_size": 63488 00:19:52.002 }, 00:19:52.002 { 00:19:52.002 "name": "BaseBdev2", 00:19:52.002 "uuid": "827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:52.002 "is_configured": true, 00:19:52.002 "data_offset": 2048, 00:19:52.002 "data_size": 63488 00:19:52.002 }, 00:19:52.002 { 
00:19:52.002 "name": "BaseBdev3", 00:19:52.002 "uuid": "087bfbd7-b198-4039-9fda-25ce6df52f7a", 00:19:52.003 "is_configured": true, 00:19:52.003 "data_offset": 2048, 00:19:52.003 "data_size": 63488 00:19:52.003 }, 00:19:52.003 { 00:19:52.003 "name": "BaseBdev4", 00:19:52.003 "uuid": "55bbf182-8872-435f-a4b5-dcfc3389ebf1", 00:19:52.003 "is_configured": true, 00:19:52.003 "data_offset": 2048, 00:19:52.003 "data_size": 63488 00:19:52.003 } 00:19:52.003 ] 00:19:52.003 }' 00:19:52.003 07:55:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.003 07:55:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:52.943 [2024-07-15 07:55:37.615877] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:52.943 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:52.944 "name": "Existed_Raid", 00:19:52.944 "aliases": [ 00:19:52.944 "10781b86-cdb5-4c88-9881-d67e4dcd1ad1" 00:19:52.944 ], 00:19:52.944 "product_name": "Raid Volume", 00:19:52.944 "block_size": 512, 00:19:52.944 "num_blocks": 63488, 00:19:52.944 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:52.944 "assigned_rate_limits": { 00:19:52.944 "rw_ios_per_sec": 0, 00:19:52.944 "rw_mbytes_per_sec": 0, 00:19:52.944 "r_mbytes_per_sec": 0, 00:19:52.944 "w_mbytes_per_sec": 0 00:19:52.944 }, 00:19:52.944 "claimed": false, 00:19:52.944 "zoned": false, 00:19:52.944 "supported_io_types": { 00:19:52.944 "read": true, 00:19:52.944 "write": true, 00:19:52.944 "unmap": false, 00:19:52.944 "flush": false, 00:19:52.944 "reset": true, 00:19:52.944 "nvme_admin": false, 00:19:52.944 "nvme_io": false, 00:19:52.944 "nvme_io_md": false, 00:19:52.944 "write_zeroes": true, 00:19:52.944 "zcopy": false, 00:19:52.944 "get_zone_info": false, 00:19:52.944 "zone_management": false, 00:19:52.944 "zone_append": false, 00:19:52.944 "compare": false, 00:19:52.944 "compare_and_write": false, 00:19:52.944 "abort": false, 00:19:52.944 "seek_hole": false, 00:19:52.944 "seek_data": false, 00:19:52.944 "copy": false, 00:19:52.944 "nvme_iov_md": false 00:19:52.944 }, 00:19:52.944 "memory_domains": [ 00:19:52.944 { 00:19:52.944 "dma_device_id": "system", 00:19:52.944 "dma_device_type": 1 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.944 "dma_device_type": 2 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "dma_device_id": "system", 00:19:52.944 "dma_device_type": 1 00:19:52.944 }, 00:19:52.944 { 
00:19:52.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.944 "dma_device_type": 2 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "dma_device_id": "system", 00:19:52.944 "dma_device_type": 1 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.944 "dma_device_type": 2 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "dma_device_id": "system", 00:19:52.944 "dma_device_type": 1 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.944 "dma_device_type": 2 00:19:52.944 } 00:19:52.944 ], 00:19:52.944 "driver_specific": { 00:19:52.944 "raid": { 00:19:52.944 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:52.944 "strip_size_kb": 0, 00:19:52.944 "state": "online", 00:19:52.944 "raid_level": "raid1", 00:19:52.944 "superblock": true, 00:19:52.944 "num_base_bdevs": 4, 00:19:52.944 "num_base_bdevs_discovered": 4, 00:19:52.944 "num_base_bdevs_operational": 4, 00:19:52.944 "base_bdevs_list": [ 00:19:52.944 { 00:19:52.944 "name": "BaseBdev1", 00:19:52.944 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:52.944 "is_configured": true, 00:19:52.944 "data_offset": 2048, 00:19:52.944 "data_size": 63488 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "name": "BaseBdev2", 00:19:52.944 "uuid": "827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:52.944 "is_configured": true, 00:19:52.944 "data_offset": 2048, 00:19:52.944 "data_size": 63488 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "name": "BaseBdev3", 00:19:52.944 "uuid": "087bfbd7-b198-4039-9fda-25ce6df52f7a", 00:19:52.944 "is_configured": true, 00:19:52.944 "data_offset": 2048, 00:19:52.944 "data_size": 63488 00:19:52.944 }, 00:19:52.944 { 00:19:52.944 "name": "BaseBdev4", 00:19:52.944 "uuid": "55bbf182-8872-435f-a4b5-dcfc3389ebf1", 00:19:52.944 "is_configured": true, 00:19:52.944 "data_offset": 2048, 00:19:52.944 "data_size": 63488 00:19:52.944 } 00:19:52.944 ] 00:19:52.944 } 00:19:52.944 } 00:19:52.944 }' 00:19:52.944 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:52.944 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:52.944 BaseBdev2 00:19:52.944 BaseBdev3 00:19:52.944 BaseBdev4' 00:19:52.944 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:52.944 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:52.944 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.204 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.204 "name": "BaseBdev1", 00:19:53.204 "aliases": [ 00:19:53.204 "ce64cb49-49b6-4a74-a75a-5ffcce9572df" 00:19:53.204 ], 00:19:53.204 "product_name": "Malloc disk", 00:19:53.204 "block_size": 512, 00:19:53.204 "num_blocks": 65536, 00:19:53.204 "uuid": "ce64cb49-49b6-4a74-a75a-5ffcce9572df", 00:19:53.204 "assigned_rate_limits": { 00:19:53.204 "rw_ios_per_sec": 0, 00:19:53.204 "rw_mbytes_per_sec": 0, 00:19:53.204 "r_mbytes_per_sec": 0, 00:19:53.204 "w_mbytes_per_sec": 0 00:19:53.204 }, 00:19:53.204 "claimed": true, 00:19:53.204 "claim_type": "exclusive_write", 00:19:53.204 "zoned": false, 00:19:53.204 "supported_io_types": { 00:19:53.204 "read": true, 00:19:53.204 "write": true, 
00:19:53.204 "unmap": true, 00:19:53.204 "flush": true, 00:19:53.204 "reset": true, 00:19:53.204 "nvme_admin": false, 00:19:53.204 "nvme_io": false, 00:19:53.204 "nvme_io_md": false, 00:19:53.204 "write_zeroes": true, 00:19:53.204 "zcopy": true, 00:19:53.204 "get_zone_info": false, 00:19:53.204 "zone_management": false, 00:19:53.204 "zone_append": false, 00:19:53.204 "compare": false, 00:19:53.204 "compare_and_write": false, 00:19:53.204 "abort": true, 00:19:53.204 "seek_hole": false, 00:19:53.204 "seek_data": false, 00:19:53.204 "copy": true, 00:19:53.204 "nvme_iov_md": false 00:19:53.204 }, 00:19:53.204 "memory_domains": [ 00:19:53.204 { 00:19:53.204 "dma_device_id": "system", 00:19:53.204 "dma_device_type": 1 00:19:53.204 }, 00:19:53.205 { 00:19:53.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.205 "dma_device_type": 2 00:19:53.205 } 00:19:53.205 ], 00:19:53.205 "driver_specific": {} 00:19:53.205 }' 00:19:53.205 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.205 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.465 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.465 07:55:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.465 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.725 "name": "BaseBdev2", 00:19:53.725 "aliases": [ 00:19:53.725 "827cfeeb-42cb-41a4-b9f9-472df378e217" 00:19:53.725 ], 00:19:53.725 "product_name": "Malloc disk", 00:19:53.725 "block_size": 512, 00:19:53.725 "num_blocks": 65536, 00:19:53.725 "uuid": "827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:53.725 "assigned_rate_limits": { 00:19:53.725 "rw_ios_per_sec": 0, 00:19:53.725 "rw_mbytes_per_sec": 0, 00:19:53.725 "r_mbytes_per_sec": 0, 00:19:53.725 "w_mbytes_per_sec": 0 00:19:53.725 }, 00:19:53.725 "claimed": true, 00:19:53.725 "claim_type": "exclusive_write", 00:19:53.725 "zoned": false, 00:19:53.725 "supported_io_types": { 00:19:53.725 "read": true, 00:19:53.725 "write": true, 00:19:53.725 "unmap": true, 00:19:53.725 "flush": true, 00:19:53.725 "reset": true, 00:19:53.725 "nvme_admin": false, 00:19:53.725 
"nvme_io": false, 00:19:53.725 "nvme_io_md": false, 00:19:53.725 "write_zeroes": true, 00:19:53.725 "zcopy": true, 00:19:53.725 "get_zone_info": false, 00:19:53.725 "zone_management": false, 00:19:53.725 "zone_append": false, 00:19:53.725 "compare": false, 00:19:53.725 "compare_and_write": false, 00:19:53.725 "abort": true, 00:19:53.725 "seek_hole": false, 00:19:53.725 "seek_data": false, 00:19:53.725 "copy": true, 00:19:53.725 "nvme_iov_md": false 00:19:53.725 }, 00:19:53.725 "memory_domains": [ 00:19:53.725 { 00:19:53.725 "dma_device_id": "system", 00:19:53.725 "dma_device_type": 1 00:19:53.725 }, 00:19:53.725 { 00:19:53.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.725 "dma_device_type": 2 00:19:53.725 } 00:19:53.725 ], 00:19:53.725 "driver_specific": {} 00:19:53.725 }' 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.725 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:53.985 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.245 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.245 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:54.245 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.245 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:54.245 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:54.245 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.245 "name": "BaseBdev3", 00:19:54.245 "aliases": [ 00:19:54.245 "087bfbd7-b198-4039-9fda-25ce6df52f7a" 00:19:54.246 ], 00:19:54.246 "product_name": "Malloc disk", 00:19:54.246 "block_size": 512, 00:19:54.246 "num_blocks": 65536, 00:19:54.246 "uuid": "087bfbd7-b198-4039-9fda-25ce6df52f7a", 00:19:54.246 "assigned_rate_limits": { 00:19:54.246 "rw_ios_per_sec": 0, 00:19:54.246 "rw_mbytes_per_sec": 0, 00:19:54.246 "r_mbytes_per_sec": 0, 00:19:54.246 "w_mbytes_per_sec": 0 00:19:54.246 }, 00:19:54.246 "claimed": true, 00:19:54.246 "claim_type": "exclusive_write", 00:19:54.246 "zoned": false, 00:19:54.246 "supported_io_types": { 00:19:54.246 "read": true, 00:19:54.246 "write": true, 00:19:54.246 "unmap": true, 00:19:54.246 "flush": true, 00:19:54.246 "reset": true, 00:19:54.246 "nvme_admin": false, 00:19:54.246 "nvme_io": false, 00:19:54.246 "nvme_io_md": false, 00:19:54.246 "write_zeroes": true, 00:19:54.246 "zcopy": true, 00:19:54.246 
"get_zone_info": false, 00:19:54.246 "zone_management": false, 00:19:54.246 "zone_append": false, 00:19:54.246 "compare": false, 00:19:54.246 "compare_and_write": false, 00:19:54.246 "abort": true, 00:19:54.246 "seek_hole": false, 00:19:54.246 "seek_data": false, 00:19:54.246 "copy": true, 00:19:54.246 "nvme_iov_md": false 00:19:54.246 }, 00:19:54.246 "memory_domains": [ 00:19:54.246 { 00:19:54.246 "dma_device_id": "system", 00:19:54.246 "dma_device_type": 1 00:19:54.246 }, 00:19:54.246 { 00:19:54.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.246 "dma_device_type": 2 00:19:54.246 } 00:19:54.246 ], 00:19:54.246 "driver_specific": {} 00:19:54.246 }' 00:19:54.246 07:55:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.506 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.766 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.766 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:54.766 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.766 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:54.766 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.026 "name": "BaseBdev4", 00:19:55.026 "aliases": [ 00:19:55.026 "55bbf182-8872-435f-a4b5-dcfc3389ebf1" 00:19:55.026 ], 00:19:55.026 "product_name": "Malloc disk", 00:19:55.026 "block_size": 512, 00:19:55.026 "num_blocks": 65536, 00:19:55.026 "uuid": "55bbf182-8872-435f-a4b5-dcfc3389ebf1", 00:19:55.026 "assigned_rate_limits": { 00:19:55.026 "rw_ios_per_sec": 0, 00:19:55.026 "rw_mbytes_per_sec": 0, 00:19:55.026 "r_mbytes_per_sec": 0, 00:19:55.026 "w_mbytes_per_sec": 0 00:19:55.026 }, 00:19:55.026 "claimed": true, 00:19:55.026 "claim_type": "exclusive_write", 00:19:55.026 "zoned": false, 00:19:55.026 "supported_io_types": { 00:19:55.026 "read": true, 00:19:55.026 "write": true, 00:19:55.026 "unmap": true, 00:19:55.026 "flush": true, 00:19:55.026 "reset": true, 00:19:55.026 "nvme_admin": false, 00:19:55.026 "nvme_io": false, 00:19:55.026 "nvme_io_md": false, 00:19:55.026 "write_zeroes": true, 00:19:55.026 "zcopy": true, 00:19:55.026 "get_zone_info": false, 00:19:55.026 "zone_management": false, 00:19:55.026 "zone_append": false, 00:19:55.026 "compare": false, 
00:19:55.026 "compare_and_write": false, 00:19:55.026 "abort": true, 00:19:55.026 "seek_hole": false, 00:19:55.026 "seek_data": false, 00:19:55.026 "copy": true, 00:19:55.026 "nvme_iov_md": false 00:19:55.026 }, 00:19:55.026 "memory_domains": [ 00:19:55.026 { 00:19:55.026 "dma_device_id": "system", 00:19:55.026 "dma_device_type": 1 00:19:55.026 }, 00:19:55.026 { 00:19:55.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.026 "dma_device_type": 2 00:19:55.026 } 00:19:55.026 ], 00:19:55.026 "driver_specific": {} 00:19:55.026 }' 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.026 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.286 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.286 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.286 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.286 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.286 07:55:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:55.547 [2024-07-15 07:55:40.057839] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:55.547 07:55:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.547 "name": "Existed_Raid", 00:19:55.547 "uuid": "10781b86-cdb5-4c88-9881-d67e4dcd1ad1", 00:19:55.547 "strip_size_kb": 0, 00:19:55.547 "state": "online", 00:19:55.547 "raid_level": "raid1", 00:19:55.547 "superblock": true, 00:19:55.547 "num_base_bdevs": 4, 00:19:55.547 "num_base_bdevs_discovered": 3, 00:19:55.547 "num_base_bdevs_operational": 3, 00:19:55.547 "base_bdevs_list": [ 00:19:55.547 { 00:19:55.547 "name": null, 00:19:55.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:55.547 "is_configured": false, 00:19:55.547 "data_offset": 2048, 00:19:55.547 "data_size": 63488 00:19:55.547 }, 00:19:55.547 { 00:19:55.547 "name": "BaseBdev2", 00:19:55.547 "uuid": "827cfeeb-42cb-41a4-b9f9-472df378e217", 00:19:55.547 "is_configured": true, 00:19:55.547 "data_offset": 2048, 00:19:55.547 "data_size": 63488 00:19:55.547 }, 00:19:55.547 { 00:19:55.547 "name": "BaseBdev3", 00:19:55.547 "uuid": "087bfbd7-b198-4039-9fda-25ce6df52f7a", 00:19:55.547 "is_configured": true, 00:19:55.547 "data_offset": 2048, 00:19:55.547 "data_size": 63488 00:19:55.547 }, 00:19:55.547 { 00:19:55.547 "name": "BaseBdev4", 00:19:55.547 "uuid": "55bbf182-8872-435f-a4b5-dcfc3389ebf1", 00:19:55.547 "is_configured": true, 00:19:55.547 "data_offset": 2048, 00:19:55.547 "data_size": 63488 00:19:55.547 } 00:19:55.547 ] 00:19:55.547 }' 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.547 07:55:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:56.117 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:56.117 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:56.117 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.117 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:56.377 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:56.377 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:56.377 07:55:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:56.636 [2024-07-15 07:55:41.172647] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:56.636 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:56.896 [2024-07-15 07:55:41.559514] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:56.896 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:56.896 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:56.896 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.896 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:57.155 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:57.155 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:57.155 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:57.414 [2024-07-15 07:55:41.946308] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:57.414 [2024-07-15 07:55:41.946371] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:57.414 [2024-07-15 07:55:41.952340] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:57.414 [2024-07-15 07:55:41.952365] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:57.414 [2024-07-15 07:55:41.952371] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2063fc0 name Existed_Raid, state offline 00:19:57.414 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:57.414 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:57.414 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.414 07:55:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:57.414 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:57.414 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:57.414 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:57.414 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:57.414 07:55:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:57.414 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:57.673 BaseBdev2 00:19:57.673 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:57.674 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:57.674 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:57.674 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:57.674 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:57.674 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:57.674 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:57.933 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:58.193 [ 00:19:58.193 { 00:19:58.193 "name": "BaseBdev2", 00:19:58.193 "aliases": [ 00:19:58.193 "47d67c32-8653-454e-838f-06e66037933c" 00:19:58.193 ], 00:19:58.193 "product_name": "Malloc disk", 00:19:58.193 "block_size": 512, 00:19:58.193 "num_blocks": 65536, 00:19:58.193 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:19:58.193 "assigned_rate_limits": { 00:19:58.193 "rw_ios_per_sec": 0, 00:19:58.193 "rw_mbytes_per_sec": 0, 00:19:58.193 "r_mbytes_per_sec": 0, 00:19:58.193 "w_mbytes_per_sec": 0 00:19:58.193 }, 00:19:58.193 "claimed": false, 00:19:58.193 "zoned": false, 00:19:58.193 "supported_io_types": { 00:19:58.193 "read": true, 00:19:58.193 "write": true, 00:19:58.193 "unmap": true, 00:19:58.193 "flush": true, 00:19:58.193 "reset": true, 00:19:58.193 "nvme_admin": false, 00:19:58.193 "nvme_io": false, 00:19:58.193 "nvme_io_md": false, 00:19:58.193 "write_zeroes": true, 00:19:58.193 "zcopy": true, 00:19:58.193 "get_zone_info": false, 00:19:58.193 "zone_management": false, 00:19:58.193 "zone_append": false, 00:19:58.193 "compare": false, 00:19:58.193 "compare_and_write": false, 00:19:58.193 "abort": true, 00:19:58.193 "seek_hole": false, 00:19:58.193 "seek_data": false, 00:19:58.193 "copy": true, 00:19:58.193 "nvme_iov_md": false 00:19:58.193 }, 00:19:58.193 "memory_domains": [ 00:19:58.193 { 00:19:58.193 "dma_device_id": "system", 00:19:58.193 "dma_device_type": 1 00:19:58.193 }, 00:19:58.193 { 00:19:58.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.193 "dma_device_type": 2 00:19:58.193 } 00:19:58.193 ], 00:19:58.193 "driver_specific": {} 00:19:58.193 } 00:19:58.193 ] 00:19:58.193 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:58.193 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:58.193 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:58.194 BaseBdev3 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:58.194 07:55:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:58.454 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:58.713 [ 00:19:58.713 { 00:19:58.713 "name": "BaseBdev3", 00:19:58.713 "aliases": [ 00:19:58.713 "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84" 00:19:58.713 ], 00:19:58.713 "product_name": "Malloc disk", 00:19:58.713 "block_size": 512, 00:19:58.713 "num_blocks": 65536, 00:19:58.713 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:19:58.713 "assigned_rate_limits": { 00:19:58.713 "rw_ios_per_sec": 0, 00:19:58.713 "rw_mbytes_per_sec": 0, 00:19:58.713 "r_mbytes_per_sec": 0, 00:19:58.713 "w_mbytes_per_sec": 0 00:19:58.713 }, 00:19:58.713 "claimed": false, 00:19:58.713 "zoned": false, 00:19:58.713 "supported_io_types": { 00:19:58.713 "read": true, 00:19:58.713 "write": true, 00:19:58.713 "unmap": true, 00:19:58.713 "flush": true, 00:19:58.713 "reset": true, 00:19:58.713 "nvme_admin": false, 00:19:58.713 "nvme_io": false, 00:19:58.713 "nvme_io_md": false, 00:19:58.713 "write_zeroes": true, 00:19:58.713 "zcopy": true, 00:19:58.713 "get_zone_info": false, 00:19:58.713 "zone_management": false, 00:19:58.713 "zone_append": false, 00:19:58.713 "compare": false, 00:19:58.713 "compare_and_write": false, 00:19:58.713 "abort": true, 00:19:58.713 "seek_hole": false, 00:19:58.713 "seek_data": false, 00:19:58.713 "copy": true, 00:19:58.713 "nvme_iov_md": false 00:19:58.713 }, 00:19:58.713 "memory_domains": [ 00:19:58.713 { 00:19:58.713 "dma_device_id": "system", 00:19:58.713 "dma_device_type": 1 00:19:58.713 }, 00:19:58.713 { 00:19:58.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.713 "dma_device_type": 2 00:19:58.713 } 00:19:58.713 ], 00:19:58.713 "driver_specific": {} 00:19:58.713 } 00:19:58.713 ] 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:58.713 BaseBdev4 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:58.713 07:55:43 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:58.713 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:58.974 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:59.234 [ 00:19:59.234 { 00:19:59.234 "name": "BaseBdev4", 00:19:59.234 "aliases": [ 00:19:59.234 "4cd6453c-70f7-4d48-b0b0-1630c8b61862" 00:19:59.234 ], 00:19:59.234 "product_name": "Malloc disk", 00:19:59.234 "block_size": 512, 00:19:59.234 "num_blocks": 65536, 00:19:59.234 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:19:59.234 "assigned_rate_limits": { 00:19:59.234 "rw_ios_per_sec": 0, 00:19:59.234 "rw_mbytes_per_sec": 0, 00:19:59.234 "r_mbytes_per_sec": 0, 00:19:59.234 "w_mbytes_per_sec": 0 00:19:59.234 }, 00:19:59.234 "claimed": false, 00:19:59.234 "zoned": false, 00:19:59.234 "supported_io_types": { 00:19:59.234 "read": true, 00:19:59.234 "write": true, 00:19:59.234 "unmap": true, 00:19:59.234 "flush": true, 00:19:59.234 "reset": true, 00:19:59.234 "nvme_admin": false, 00:19:59.234 "nvme_io": false, 00:19:59.234 "nvme_io_md": false, 00:19:59.234 "write_zeroes": true, 00:19:59.234 "zcopy": true, 00:19:59.234 "get_zone_info": false, 00:19:59.234 "zone_management": false, 00:19:59.234 "zone_append": false, 00:19:59.234 "compare": false, 00:19:59.234 "compare_and_write": false, 00:19:59.234 "abort": true, 00:19:59.234 "seek_hole": false, 00:19:59.234 "seek_data": false, 00:19:59.234 "copy": true, 00:19:59.234 "nvme_iov_md": false 00:19:59.234 }, 00:19:59.234 "memory_domains": [ 00:19:59.234 { 00:19:59.234 "dma_device_id": "system", 00:19:59.234 "dma_device_type": 1 00:19:59.234 }, 00:19:59.234 { 00:19:59.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.234 "dma_device_type": 2 00:19:59.234 } 00:19:59.234 ], 00:19:59.234 "driver_specific": {} 00:19:59.234 } 00:19:59.234 ] 00:19:59.234 07:55:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:59.234 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:59.234 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:59.234 07:55:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:59.494 [2024-07-15 07:55:44.005481] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:59.494 [2024-07-15 07:55:44.005512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:59.494 [2024-07-15 07:55:44.005526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:19:59.494 [2024-07-15 07:55:44.006560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:59.494 [2024-07-15 07:55:44.006593] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:59.494 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.494 "name": "Existed_Raid", 00:19:59.494 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:19:59.494 "strip_size_kb": 0, 00:19:59.494 "state": "configuring", 00:19:59.494 "raid_level": "raid1", 00:19:59.494 "superblock": true, 00:19:59.494 "num_base_bdevs": 4, 00:19:59.494 "num_base_bdevs_discovered": 3, 00:19:59.494 "num_base_bdevs_operational": 4, 00:19:59.494 "base_bdevs_list": [ 00:19:59.494 { 00:19:59.494 "name": "BaseBdev1", 00:19:59.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.494 "is_configured": false, 00:19:59.494 "data_offset": 0, 00:19:59.494 "data_size": 0 00:19:59.494 }, 00:19:59.494 { 00:19:59.494 "name": "BaseBdev2", 00:19:59.494 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:19:59.494 "is_configured": true, 00:19:59.494 "data_offset": 2048, 00:19:59.494 "data_size": 63488 00:19:59.494 }, 00:19:59.494 { 00:19:59.494 "name": "BaseBdev3", 00:19:59.494 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:19:59.494 "is_configured": true, 00:19:59.494 "data_offset": 2048, 00:19:59.494 "data_size": 63488 00:19:59.495 }, 00:19:59.495 { 00:19:59.495 "name": "BaseBdev4", 00:19:59.495 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:19:59.495 "is_configured": true, 00:19:59.495 "data_offset": 2048, 00:19:59.495 "data_size": 63488 00:19:59.495 } 00:19:59.495 ] 00:19:59.495 }' 00:19:59.495 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.495 07:55:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.064 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:00.324 [2024-07-15 07:55:44.943833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.324 07:55:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:00.584 07:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:00.584 "name": "Existed_Raid", 00:20:00.584 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:00.584 "strip_size_kb": 0, 00:20:00.584 "state": "configuring", 00:20:00.584 "raid_level": "raid1", 00:20:00.584 "superblock": true, 00:20:00.584 "num_base_bdevs": 4, 00:20:00.584 "num_base_bdevs_discovered": 2, 00:20:00.584 "num_base_bdevs_operational": 4, 00:20:00.584 "base_bdevs_list": [ 00:20:00.584 { 00:20:00.584 "name": "BaseBdev1", 00:20:00.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.584 "is_configured": false, 00:20:00.584 "data_offset": 0, 00:20:00.584 "data_size": 0 00:20:00.584 }, 00:20:00.584 { 00:20:00.584 "name": null, 00:20:00.584 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:00.584 "is_configured": false, 00:20:00.584 "data_offset": 2048, 00:20:00.584 "data_size": 63488 00:20:00.584 }, 00:20:00.584 { 00:20:00.584 "name": "BaseBdev3", 00:20:00.584 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:00.584 "is_configured": true, 00:20:00.584 "data_offset": 2048, 00:20:00.584 "data_size": 63488 00:20:00.584 }, 00:20:00.584 { 00:20:00.584 "name": "BaseBdev4", 00:20:00.584 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:00.584 "is_configured": true, 00:20:00.584 "data_offset": 2048, 00:20:00.584 "data_size": 63488 00:20:00.584 } 00:20:00.584 ] 00:20:00.584 }' 00:20:00.584 07:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:00.584 07:55:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:01.155 07:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.155 07:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:01.415 07:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:01.415 07:55:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:01.415 [2024-07-15 07:55:46.095626] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:01.415 BaseBdev1 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.415 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:01.676 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:01.936 [ 00:20:01.937 { 00:20:01.937 "name": "BaseBdev1", 00:20:01.937 "aliases": [ 00:20:01.937 "daaff3ca-8725-4bcc-949e-7e59b8557319" 00:20:01.937 ], 00:20:01.937 "product_name": "Malloc disk", 00:20:01.937 "block_size": 512, 00:20:01.937 "num_blocks": 65536, 00:20:01.937 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:01.937 "assigned_rate_limits": { 00:20:01.937 "rw_ios_per_sec": 0, 00:20:01.937 "rw_mbytes_per_sec": 0, 00:20:01.937 "r_mbytes_per_sec": 0, 00:20:01.937 "w_mbytes_per_sec": 0 00:20:01.937 }, 00:20:01.937 "claimed": true, 00:20:01.937 "claim_type": "exclusive_write", 00:20:01.937 "zoned": false, 00:20:01.937 "supported_io_types": { 00:20:01.937 "read": true, 00:20:01.937 "write": true, 00:20:01.937 "unmap": true, 00:20:01.937 "flush": true, 00:20:01.937 "reset": true, 00:20:01.937 "nvme_admin": false, 00:20:01.937 "nvme_io": false, 00:20:01.937 "nvme_io_md": false, 00:20:01.937 "write_zeroes": true, 00:20:01.937 "zcopy": true, 00:20:01.937 "get_zone_info": false, 00:20:01.937 "zone_management": false, 00:20:01.937 "zone_append": false, 00:20:01.937 "compare": false, 00:20:01.937 "compare_and_write": false, 00:20:01.937 "abort": true, 00:20:01.937 "seek_hole": false, 00:20:01.937 "seek_data": false, 00:20:01.937 "copy": true, 00:20:01.937 "nvme_iov_md": false 00:20:01.937 }, 00:20:01.937 "memory_domains": [ 00:20:01.937 { 00:20:01.937 "dma_device_id": "system", 00:20:01.937 "dma_device_type": 1 00:20:01.937 }, 00:20:01.937 { 00:20:01.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.937 "dma_device_type": 2 00:20:01.937 } 00:20:01.937 ], 00:20:01.937 "driver_specific": {} 00:20:01.937 } 00:20:01.937 ] 00:20:01.937 07:55:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.937 "name": "Existed_Raid", 00:20:01.937 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:01.937 "strip_size_kb": 0, 00:20:01.937 "state": "configuring", 00:20:01.937 "raid_level": "raid1", 00:20:01.937 "superblock": true, 00:20:01.937 "num_base_bdevs": 4, 00:20:01.937 "num_base_bdevs_discovered": 3, 00:20:01.937 "num_base_bdevs_operational": 4, 00:20:01.937 "base_bdevs_list": [ 00:20:01.937 { 00:20:01.937 "name": "BaseBdev1", 00:20:01.937 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:01.937 "is_configured": true, 00:20:01.937 "data_offset": 2048, 00:20:01.937 "data_size": 63488 00:20:01.937 }, 00:20:01.937 { 00:20:01.937 "name": null, 00:20:01.937 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:01.937 "is_configured": false, 00:20:01.937 "data_offset": 2048, 00:20:01.937 "data_size": 63488 00:20:01.937 }, 00:20:01.937 { 00:20:01.937 "name": "BaseBdev3", 00:20:01.937 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:01.937 "is_configured": true, 00:20:01.937 "data_offset": 2048, 00:20:01.937 "data_size": 63488 00:20:01.937 }, 00:20:01.937 { 00:20:01.937 "name": "BaseBdev4", 00:20:01.937 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:01.937 "is_configured": true, 00:20:01.937 "data_offset": 2048, 00:20:01.937 "data_size": 63488 00:20:01.937 } 00:20:01.937 ] 00:20:01.937 }' 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.937 07:55:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:02.508 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.508 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
jq '.[0].base_bdevs_list[0].is_configured' 00:20:02.826 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:02.826 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:03.114 [2024-07-15 07:55:47.571385] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.114 "name": "Existed_Raid", 00:20:03.114 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:03.114 "strip_size_kb": 0, 00:20:03.114 "state": "configuring", 00:20:03.114 "raid_level": "raid1", 00:20:03.114 "superblock": true, 00:20:03.114 "num_base_bdevs": 4, 00:20:03.114 "num_base_bdevs_discovered": 2, 00:20:03.114 "num_base_bdevs_operational": 4, 00:20:03.114 "base_bdevs_list": [ 00:20:03.114 { 00:20:03.114 "name": "BaseBdev1", 00:20:03.114 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:03.114 "is_configured": true, 00:20:03.114 "data_offset": 2048, 00:20:03.114 "data_size": 63488 00:20:03.114 }, 00:20:03.114 { 00:20:03.114 "name": null, 00:20:03.114 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:03.114 "is_configured": false, 00:20:03.114 "data_offset": 2048, 00:20:03.114 "data_size": 63488 00:20:03.114 }, 00:20:03.114 { 00:20:03.114 "name": null, 00:20:03.114 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:03.114 "is_configured": false, 00:20:03.114 "data_offset": 2048, 00:20:03.114 "data_size": 63488 00:20:03.114 }, 00:20:03.114 { 00:20:03.114 "name": "BaseBdev4", 00:20:03.114 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:03.114 "is_configured": true, 00:20:03.114 "data_offset": 2048, 00:20:03.114 "data_size": 63488 00:20:03.114 } 00:20:03.114 ] 00:20:03.114 }' 00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:20:03.114 07:55:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:03.685 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.685 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:03.946 [2024-07-15 07:55:48.682211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.946 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.206 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.206 "name": "Existed_Raid", 00:20:04.206 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:04.206 "strip_size_kb": 0, 00:20:04.206 "state": "configuring", 00:20:04.206 "raid_level": "raid1", 00:20:04.206 "superblock": true, 00:20:04.206 "num_base_bdevs": 4, 00:20:04.206 "num_base_bdevs_discovered": 3, 00:20:04.206 "num_base_bdevs_operational": 4, 00:20:04.206 "base_bdevs_list": [ 00:20:04.206 { 00:20:04.206 "name": "BaseBdev1", 00:20:04.206 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:04.206 "is_configured": true, 00:20:04.206 "data_offset": 2048, 00:20:04.206 "data_size": 63488 00:20:04.206 }, 00:20:04.206 { 00:20:04.206 "name": null, 00:20:04.206 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:04.206 "is_configured": false, 00:20:04.206 "data_offset": 2048, 00:20:04.206 "data_size": 63488 00:20:04.206 }, 00:20:04.206 { 00:20:04.206 "name": "BaseBdev3", 00:20:04.206 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:04.206 "is_configured": true, 00:20:04.206 
"data_offset": 2048, 00:20:04.206 "data_size": 63488 00:20:04.206 }, 00:20:04.206 { 00:20:04.206 "name": "BaseBdev4", 00:20:04.206 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:04.206 "is_configured": true, 00:20:04.206 "data_offset": 2048, 00:20:04.206 "data_size": 63488 00:20:04.206 } 00:20:04.206 ] 00:20:04.206 }' 00:20:04.206 07:55:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.206 07:55:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:04.776 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:04.776 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.037 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:05.037 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:05.037 [2024-07-15 07:55:49.785006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.297 07:55:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:05.297 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.297 "name": "Existed_Raid", 00:20:05.297 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:05.297 "strip_size_kb": 0, 00:20:05.297 "state": "configuring", 00:20:05.297 "raid_level": "raid1", 00:20:05.297 "superblock": true, 00:20:05.297 "num_base_bdevs": 4, 00:20:05.297 "num_base_bdevs_discovered": 2, 00:20:05.297 "num_base_bdevs_operational": 4, 00:20:05.297 "base_bdevs_list": [ 00:20:05.297 { 00:20:05.297 "name": null, 00:20:05.297 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:05.297 "is_configured": false, 00:20:05.297 "data_offset": 2048, 00:20:05.297 "data_size": 63488 00:20:05.297 }, 
00:20:05.297 { 00:20:05.297 "name": null, 00:20:05.297 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:05.297 "is_configured": false, 00:20:05.297 "data_offset": 2048, 00:20:05.297 "data_size": 63488 00:20:05.297 }, 00:20:05.297 { 00:20:05.297 "name": "BaseBdev3", 00:20:05.297 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:05.297 "is_configured": true, 00:20:05.297 "data_offset": 2048, 00:20:05.297 "data_size": 63488 00:20:05.297 }, 00:20:05.297 { 00:20:05.297 "name": "BaseBdev4", 00:20:05.297 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:05.297 "is_configured": true, 00:20:05.297 "data_offset": 2048, 00:20:05.297 "data_size": 63488 00:20:05.297 } 00:20:05.297 ] 00:20:05.297 }' 00:20:05.297 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.297 07:55:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.868 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.868 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:06.128 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:06.128 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:06.387 [2024-07-15 07:55:50.913669] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:06.387 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:06.387 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:06.387 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:06.387 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.388 07:55:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.388 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.388 "name": "Existed_Raid", 00:20:06.388 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:06.388 "strip_size_kb": 0, 00:20:06.388 "state": "configuring", 00:20:06.388 "raid_level": 
"raid1", 00:20:06.388 "superblock": true, 00:20:06.388 "num_base_bdevs": 4, 00:20:06.388 "num_base_bdevs_discovered": 3, 00:20:06.388 "num_base_bdevs_operational": 4, 00:20:06.388 "base_bdevs_list": [ 00:20:06.388 { 00:20:06.388 "name": null, 00:20:06.388 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:06.388 "is_configured": false, 00:20:06.388 "data_offset": 2048, 00:20:06.388 "data_size": 63488 00:20:06.388 }, 00:20:06.388 { 00:20:06.388 "name": "BaseBdev2", 00:20:06.388 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:06.388 "is_configured": true, 00:20:06.388 "data_offset": 2048, 00:20:06.388 "data_size": 63488 00:20:06.388 }, 00:20:06.388 { 00:20:06.388 "name": "BaseBdev3", 00:20:06.388 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:06.388 "is_configured": true, 00:20:06.388 "data_offset": 2048, 00:20:06.388 "data_size": 63488 00:20:06.388 }, 00:20:06.388 { 00:20:06.388 "name": "BaseBdev4", 00:20:06.388 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:06.388 "is_configured": true, 00:20:06.388 "data_offset": 2048, 00:20:06.388 "data_size": 63488 00:20:06.388 } 00:20:06.388 ] 00:20:06.388 }' 00:20:06.388 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.388 07:55:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:06.955 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.955 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:07.214 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:07.214 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.214 07:55:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:07.474 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u daaff3ca-8725-4bcc-949e-7e59b8557319 00:20:07.474 [2024-07-15 07:55:52.222039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:07.474 [2024-07-15 07:55:52.222163] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2066080 00:20:07.474 [2024-07-15 07:55:52.222170] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:07.474 [2024-07-15 07:55:52.222307] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205b0b0 00:20:07.474 [2024-07-15 07:55:52.222402] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2066080 00:20:07.474 [2024-07-15 07:55:52.222412] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2066080 00:20:07.474 [2024-07-15 07:55:52.222479] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:07.474 NewBaseBdev 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:07.735 
07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:07.735 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:07.995 [ 00:20:07.995 { 00:20:07.995 "name": "NewBaseBdev", 00:20:07.995 "aliases": [ 00:20:07.995 "daaff3ca-8725-4bcc-949e-7e59b8557319" 00:20:07.995 ], 00:20:07.995 "product_name": "Malloc disk", 00:20:07.995 "block_size": 512, 00:20:07.995 "num_blocks": 65536, 00:20:07.995 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:07.995 "assigned_rate_limits": { 00:20:07.995 "rw_ios_per_sec": 0, 00:20:07.995 "rw_mbytes_per_sec": 0, 00:20:07.995 "r_mbytes_per_sec": 0, 00:20:07.995 "w_mbytes_per_sec": 0 00:20:07.995 }, 00:20:07.995 "claimed": true, 00:20:07.995 "claim_type": "exclusive_write", 00:20:07.995 "zoned": false, 00:20:07.995 "supported_io_types": { 00:20:07.995 "read": true, 00:20:07.995 "write": true, 00:20:07.995 "unmap": true, 00:20:07.995 "flush": true, 00:20:07.995 "reset": true, 00:20:07.995 "nvme_admin": false, 00:20:07.995 "nvme_io": false, 00:20:07.995 "nvme_io_md": false, 00:20:07.995 "write_zeroes": true, 00:20:07.996 "zcopy": true, 00:20:07.996 "get_zone_info": false, 00:20:07.996 "zone_management": false, 00:20:07.996 "zone_append": false, 00:20:07.996 "compare": false, 00:20:07.996 "compare_and_write": false, 00:20:07.996 "abort": true, 00:20:07.996 "seek_hole": false, 00:20:07.996 "seek_data": false, 00:20:07.996 "copy": true, 00:20:07.996 "nvme_iov_md": false 00:20:07.996 }, 00:20:07.996 "memory_domains": [ 00:20:07.996 { 00:20:07.996 "dma_device_id": "system", 00:20:07.996 "dma_device_type": 1 00:20:07.996 }, 00:20:07.996 { 00:20:07.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.996 "dma_device_type": 2 00:20:07.996 } 00:20:07.996 ], 00:20:07.996 "driver_specific": {} 00:20:07.996 } 00:20:07.996 ] 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.996 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:08.256 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.256 "name": "Existed_Raid", 00:20:08.256 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:08.256 "strip_size_kb": 0, 00:20:08.256 "state": "online", 00:20:08.256 "raid_level": "raid1", 00:20:08.256 "superblock": true, 00:20:08.256 "num_base_bdevs": 4, 00:20:08.256 "num_base_bdevs_discovered": 4, 00:20:08.256 "num_base_bdevs_operational": 4, 00:20:08.256 "base_bdevs_list": [ 00:20:08.256 { 00:20:08.256 "name": "NewBaseBdev", 00:20:08.256 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:08.256 "is_configured": true, 00:20:08.256 "data_offset": 2048, 00:20:08.256 "data_size": 63488 00:20:08.256 }, 00:20:08.256 { 00:20:08.256 "name": "BaseBdev2", 00:20:08.256 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:08.256 "is_configured": true, 00:20:08.256 "data_offset": 2048, 00:20:08.256 "data_size": 63488 00:20:08.256 }, 00:20:08.256 { 00:20:08.256 "name": "BaseBdev3", 00:20:08.256 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:08.256 "is_configured": true, 00:20:08.256 "data_offset": 2048, 00:20:08.256 "data_size": 63488 00:20:08.256 }, 00:20:08.256 { 00:20:08.256 "name": "BaseBdev4", 00:20:08.256 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:08.256 "is_configured": true, 00:20:08.256 "data_offset": 2048, 00:20:08.256 "data_size": 63488 00:20:08.256 } 00:20:08.256 ] 00:20:08.256 }' 00:20:08.256 07:55:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.256 07:55:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:08.826 [2024-07-15 07:55:53.505545] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:08.826 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:08.826 "name": "Existed_Raid", 00:20:08.826 "aliases": [ 
00:20:08.826 "165cb207-7ced-4560-ad21-4c4224f34c79" 00:20:08.826 ], 00:20:08.826 "product_name": "Raid Volume", 00:20:08.826 "block_size": 512, 00:20:08.826 "num_blocks": 63488, 00:20:08.826 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:08.826 "assigned_rate_limits": { 00:20:08.826 "rw_ios_per_sec": 0, 00:20:08.826 "rw_mbytes_per_sec": 0, 00:20:08.826 "r_mbytes_per_sec": 0, 00:20:08.826 "w_mbytes_per_sec": 0 00:20:08.826 }, 00:20:08.826 "claimed": false, 00:20:08.826 "zoned": false, 00:20:08.826 "supported_io_types": { 00:20:08.826 "read": true, 00:20:08.826 "write": true, 00:20:08.826 "unmap": false, 00:20:08.826 "flush": false, 00:20:08.826 "reset": true, 00:20:08.826 "nvme_admin": false, 00:20:08.826 "nvme_io": false, 00:20:08.826 "nvme_io_md": false, 00:20:08.826 "write_zeroes": true, 00:20:08.826 "zcopy": false, 00:20:08.826 "get_zone_info": false, 00:20:08.826 "zone_management": false, 00:20:08.826 "zone_append": false, 00:20:08.826 "compare": false, 00:20:08.826 "compare_and_write": false, 00:20:08.826 "abort": false, 00:20:08.826 "seek_hole": false, 00:20:08.826 "seek_data": false, 00:20:08.826 "copy": false, 00:20:08.826 "nvme_iov_md": false 00:20:08.826 }, 00:20:08.826 "memory_domains": [ 00:20:08.826 { 00:20:08.826 "dma_device_id": "system", 00:20:08.826 "dma_device_type": 1 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.826 "dma_device_type": 2 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "system", 00:20:08.826 "dma_device_type": 1 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.826 "dma_device_type": 2 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "system", 00:20:08.826 "dma_device_type": 1 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.826 "dma_device_type": 2 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "system", 00:20:08.826 "dma_device_type": 1 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.826 "dma_device_type": 2 00:20:08.826 } 00:20:08.826 ], 00:20:08.826 "driver_specific": { 00:20:08.826 "raid": { 00:20:08.826 "uuid": "165cb207-7ced-4560-ad21-4c4224f34c79", 00:20:08.826 "strip_size_kb": 0, 00:20:08.826 "state": "online", 00:20:08.826 "raid_level": "raid1", 00:20:08.826 "superblock": true, 00:20:08.826 "num_base_bdevs": 4, 00:20:08.826 "num_base_bdevs_discovered": 4, 00:20:08.826 "num_base_bdevs_operational": 4, 00:20:08.826 "base_bdevs_list": [ 00:20:08.826 { 00:20:08.826 "name": "NewBaseBdev", 00:20:08.826 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:08.826 "is_configured": true, 00:20:08.826 "data_offset": 2048, 00:20:08.826 "data_size": 63488 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "name": "BaseBdev2", 00:20:08.826 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:08.826 "is_configured": true, 00:20:08.826 "data_offset": 2048, 00:20:08.826 "data_size": 63488 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "name": "BaseBdev3", 00:20:08.826 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:08.826 "is_configured": true, 00:20:08.826 "data_offset": 2048, 00:20:08.826 "data_size": 63488 00:20:08.826 }, 00:20:08.826 { 00:20:08.826 "name": "BaseBdev4", 00:20:08.826 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:08.826 "is_configured": true, 00:20:08.826 "data_offset": 2048, 00:20:08.826 "data_size": 63488 00:20:08.826 } 00:20:08.826 ] 00:20:08.826 } 00:20:08.826 } 00:20:08.826 }' 00:20:08.827 07:55:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:08.827 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:08.827 BaseBdev2 00:20:08.827 BaseBdev3 00:20:08.827 BaseBdev4' 00:20:08.827 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:08.827 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:08.827 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:09.086 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:09.086 "name": "NewBaseBdev", 00:20:09.086 "aliases": [ 00:20:09.086 "daaff3ca-8725-4bcc-949e-7e59b8557319" 00:20:09.086 ], 00:20:09.086 "product_name": "Malloc disk", 00:20:09.086 "block_size": 512, 00:20:09.086 "num_blocks": 65536, 00:20:09.086 "uuid": "daaff3ca-8725-4bcc-949e-7e59b8557319", 00:20:09.086 "assigned_rate_limits": { 00:20:09.086 "rw_ios_per_sec": 0, 00:20:09.086 "rw_mbytes_per_sec": 0, 00:20:09.086 "r_mbytes_per_sec": 0, 00:20:09.086 "w_mbytes_per_sec": 0 00:20:09.086 }, 00:20:09.086 "claimed": true, 00:20:09.086 "claim_type": "exclusive_write", 00:20:09.086 "zoned": false, 00:20:09.086 "supported_io_types": { 00:20:09.086 "read": true, 00:20:09.086 "write": true, 00:20:09.086 "unmap": true, 00:20:09.086 "flush": true, 00:20:09.086 "reset": true, 00:20:09.086 "nvme_admin": false, 00:20:09.086 "nvme_io": false, 00:20:09.086 "nvme_io_md": false, 00:20:09.086 "write_zeroes": true, 00:20:09.086 "zcopy": true, 00:20:09.086 "get_zone_info": false, 00:20:09.086 "zone_management": false, 00:20:09.086 "zone_append": false, 00:20:09.086 "compare": false, 00:20:09.086 "compare_and_write": false, 00:20:09.086 "abort": true, 00:20:09.086 "seek_hole": false, 00:20:09.086 "seek_data": false, 00:20:09.086 "copy": true, 00:20:09.086 "nvme_iov_md": false 00:20:09.086 }, 00:20:09.086 "memory_domains": [ 00:20:09.086 { 00:20:09.086 "dma_device_id": "system", 00:20:09.086 "dma_device_type": 1 00:20:09.086 }, 00:20:09.086 { 00:20:09.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.086 "dma_device_type": 2 00:20:09.086 } 00:20:09.086 ], 00:20:09.086 "driver_specific": {} 00:20:09.086 }' 00:20:09.086 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.086 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.346 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:09.346 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.346 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.346 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:09.346 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.346 07:55:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.346 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:09.346 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:20:09.346 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.605 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:09.605 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:09.605 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:09.605 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:09.605 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:09.605 "name": "BaseBdev2", 00:20:09.605 "aliases": [ 00:20:09.605 "47d67c32-8653-454e-838f-06e66037933c" 00:20:09.605 ], 00:20:09.605 "product_name": "Malloc disk", 00:20:09.605 "block_size": 512, 00:20:09.605 "num_blocks": 65536, 00:20:09.605 "uuid": "47d67c32-8653-454e-838f-06e66037933c", 00:20:09.605 "assigned_rate_limits": { 00:20:09.605 "rw_ios_per_sec": 0, 00:20:09.605 "rw_mbytes_per_sec": 0, 00:20:09.605 "r_mbytes_per_sec": 0, 00:20:09.605 "w_mbytes_per_sec": 0 00:20:09.605 }, 00:20:09.605 "claimed": true, 00:20:09.605 "claim_type": "exclusive_write", 00:20:09.605 "zoned": false, 00:20:09.605 "supported_io_types": { 00:20:09.605 "read": true, 00:20:09.605 "write": true, 00:20:09.605 "unmap": true, 00:20:09.605 "flush": true, 00:20:09.605 "reset": true, 00:20:09.605 "nvme_admin": false, 00:20:09.605 "nvme_io": false, 00:20:09.605 "nvme_io_md": false, 00:20:09.605 "write_zeroes": true, 00:20:09.605 "zcopy": true, 00:20:09.605 "get_zone_info": false, 00:20:09.605 "zone_management": false, 00:20:09.605 "zone_append": false, 00:20:09.605 "compare": false, 00:20:09.605 "compare_and_write": false, 00:20:09.605 "abort": true, 00:20:09.605 "seek_hole": false, 00:20:09.605 "seek_data": false, 00:20:09.605 "copy": true, 00:20:09.605 "nvme_iov_md": false 00:20:09.605 }, 00:20:09.605 "memory_domains": [ 00:20:09.605 { 00:20:09.605 "dma_device_id": "system", 00:20:09.605 "dma_device_type": 1 00:20:09.605 }, 00:20:09.605 { 00:20:09.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.605 "dma_device_type": 2 00:20:09.605 } 00:20:09.605 ], 00:20:09.605 "driver_specific": {} 00:20:09.605 }' 00:20:09.605 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:09.865 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.124 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.124 
07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:10.124 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:10.124 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:10.124 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:10.124 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.124 "name": "BaseBdev3", 00:20:10.124 "aliases": [ 00:20:10.124 "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84" 00:20:10.124 ], 00:20:10.124 "product_name": "Malloc disk", 00:20:10.124 "block_size": 512, 00:20:10.124 "num_blocks": 65536, 00:20:10.124 "uuid": "638ffa0a-90c4-4f4d-a1ba-eaf402dfcb84", 00:20:10.124 "assigned_rate_limits": { 00:20:10.124 "rw_ios_per_sec": 0, 00:20:10.124 "rw_mbytes_per_sec": 0, 00:20:10.124 "r_mbytes_per_sec": 0, 00:20:10.124 "w_mbytes_per_sec": 0 00:20:10.124 }, 00:20:10.124 "claimed": true, 00:20:10.124 "claim_type": "exclusive_write", 00:20:10.124 "zoned": false, 00:20:10.124 "supported_io_types": { 00:20:10.125 "read": true, 00:20:10.125 "write": true, 00:20:10.125 "unmap": true, 00:20:10.125 "flush": true, 00:20:10.125 "reset": true, 00:20:10.125 "nvme_admin": false, 00:20:10.125 "nvme_io": false, 00:20:10.125 "nvme_io_md": false, 00:20:10.125 "write_zeroes": true, 00:20:10.125 "zcopy": true, 00:20:10.125 "get_zone_info": false, 00:20:10.125 "zone_management": false, 00:20:10.125 "zone_append": false, 00:20:10.125 "compare": false, 00:20:10.125 "compare_and_write": false, 00:20:10.125 "abort": true, 00:20:10.125 "seek_hole": false, 00:20:10.125 "seek_data": false, 00:20:10.125 "copy": true, 00:20:10.125 "nvme_iov_md": false 00:20:10.125 }, 00:20:10.125 "memory_domains": [ 00:20:10.125 { 00:20:10.125 "dma_device_id": "system", 00:20:10.125 "dma_device_type": 1 00:20:10.125 }, 00:20:10.125 { 00:20:10.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.125 "dma_device_type": 2 00:20:10.125 } 00:20:10.125 ], 00:20:10.125 "driver_specific": {} 00:20:10.125 }' 00:20:10.125 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.384 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.384 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.384 07:55:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.384 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.384 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.384 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.384 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.644 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:10.644 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.644 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.644 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:10.644 07:55:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:10.644 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:10.644 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.904 "name": "BaseBdev4", 00:20:10.904 "aliases": [ 00:20:10.904 "4cd6453c-70f7-4d48-b0b0-1630c8b61862" 00:20:10.904 ], 00:20:10.904 "product_name": "Malloc disk", 00:20:10.904 "block_size": 512, 00:20:10.904 "num_blocks": 65536, 00:20:10.904 "uuid": "4cd6453c-70f7-4d48-b0b0-1630c8b61862", 00:20:10.904 "assigned_rate_limits": { 00:20:10.904 "rw_ios_per_sec": 0, 00:20:10.904 "rw_mbytes_per_sec": 0, 00:20:10.904 "r_mbytes_per_sec": 0, 00:20:10.904 "w_mbytes_per_sec": 0 00:20:10.904 }, 00:20:10.904 "claimed": true, 00:20:10.904 "claim_type": "exclusive_write", 00:20:10.904 "zoned": false, 00:20:10.904 "supported_io_types": { 00:20:10.904 "read": true, 00:20:10.904 "write": true, 00:20:10.904 "unmap": true, 00:20:10.904 "flush": true, 00:20:10.904 "reset": true, 00:20:10.904 "nvme_admin": false, 00:20:10.904 "nvme_io": false, 00:20:10.904 "nvme_io_md": false, 00:20:10.904 "write_zeroes": true, 00:20:10.904 "zcopy": true, 00:20:10.904 "get_zone_info": false, 00:20:10.904 "zone_management": false, 00:20:10.904 "zone_append": false, 00:20:10.904 "compare": false, 00:20:10.904 "compare_and_write": false, 00:20:10.904 "abort": true, 00:20:10.904 "seek_hole": false, 00:20:10.904 "seek_data": false, 00:20:10.904 "copy": true, 00:20:10.904 "nvme_iov_md": false 00:20:10.904 }, 00:20:10.904 "memory_domains": [ 00:20:10.904 { 00:20:10.904 "dma_device_id": "system", 00:20:10.904 "dma_device_type": 1 00:20:10.904 }, 00:20:10.904 { 00:20:10.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.904 "dma_device_type": 2 00:20:10.904 } 00:20:10.904 ], 00:20:10.904 "driver_specific": {} 00:20:10.904 }' 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.904 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.164 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.164 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.164 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.164 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.164 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:11.424 [2024-07-15 07:55:55.927502] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:11.424 [2024-07-15 07:55:55.927520] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:11.424 [2024-07-15 07:55:55.927555] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:11.424 [2024-07-15 07:55:55.927765] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:11.424 [2024-07-15 07:55:55.927773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2066080 name Existed_Raid, state offline 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1692218 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1692218 ']' 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1692218 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1692218 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1692218' 00:20:11.424 killing process with pid 1692218 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1692218 00:20:11.424 [2024-07-15 07:55:55.993724] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:11.424 07:55:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1692218 00:20:11.424 [2024-07-15 07:55:56.014170] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:11.424 07:55:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:11.424 00:20:11.424 real 0m28.325s 00:20:11.424 user 0m53.183s 00:20:11.424 sys 0m4.022s 00:20:11.424 07:55:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:11.424 07:55:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.424 ************************************ 00:20:11.424 END TEST raid_state_function_test_sb 00:20:11.424 ************************************ 00:20:11.424 07:55:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:11.424 07:55:56 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:20:11.424 07:55:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:11.424 07:55:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:11.424 07:55:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:11.683 ************************************ 00:20:11.683 START TEST raid_superblock_test 00:20:11.683 ************************************ 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test 
raid1 4 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1697547 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1697547 /var/tmp/spdk-raid.sock 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1697547 ']' 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:11.684 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:11.684 07:55:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.684 [2024-07-15 07:55:56.282056] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:20:11.684 [2024-07-15 07:55:56.282100] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1697547 ] 00:20:11.684 [2024-07-15 07:55:56.366452] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:11.684 [2024-07-15 07:55:56.428144] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:11.943 [2024-07-15 07:55:56.466853] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:11.943 [2024-07-15 07:55:56.466877] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:12.512 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:12.513 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:12.772 malloc1 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:12.772 [2024-07-15 07:55:57.461291] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:12.772 [2024-07-15 07:55:57.461326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:12.772 [2024-07-15 07:55:57.461338] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dea20 00:20:12.772 [2024-07-15 07:55:57.461348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:12.772 [2024-07-15 07:55:57.462694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:12.772 [2024-07-15 07:55:57.462721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:12.772 pt1 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:12.772 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:13.031 malloc2 00:20:13.031 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:13.292 [2024-07-15 07:55:57.848304] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:13.292 [2024-07-15 07:55:57.848331] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:13.292 [2024-07-15 07:55:57.848342] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18df040 00:20:13.292 [2024-07-15 07:55:57.848348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:13.292 [2024-07-15 07:55:57.849565] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:13.292 [2024-07-15 07:55:57.849584] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:13.292 pt2 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:13.292 07:55:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:13.553 malloc3 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:13.553 [2024-07-15 07:55:58.259230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:13.553 [2024-07-15 07:55:58.259260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:13.553 [2024-07-15 07:55:58.259271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18df540 00:20:13.553 [2024-07-15 07:55:58.259277] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:13.553 [2024-07-15 07:55:58.260494] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:13.553 [2024-07-15 07:55:58.260512] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:13.553 pt3 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:13.553 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:13.813 malloc4 00:20:13.813 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:14.073 [2024-07-15 07:55:58.638023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:14.073 [2024-07-15 07:55:58.638049] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:14.073 [2024-07-15 07:55:58.638058] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8cd60 00:20:14.073 [2024-07-15 07:55:58.638064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:14.073 [2024-07-15 07:55:58.639274] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:14.073 [2024-07-15 07:55:58.639292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:14.073 pt4 00:20:14.073 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:14.073 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:14.073 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:14.073 [2024-07-15 07:55:58.826514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:14.073 [2024-07-15 07:55:58.827515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:14.073 [2024-07-15 07:55:58.827554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:14.073 [2024-07-15 07:55:58.827587] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:14.073 [2024-07-15 07:55:58.827733] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a89e20 00:20:14.073 [2024-07-15 07:55:58.827740] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:14.073 [2024-07-15 07:55:58.827891] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x18e0000 00:20:14.073 [2024-07-15 07:55:58.828005] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a89e20 00:20:14.073 [2024-07-15 07:55:58.828010] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a89e20 00:20:14.073 [2024-07-15 07:55:58.828078] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.335 07:55:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:14.335 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.335 "name": "raid_bdev1", 00:20:14.335 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:14.335 "strip_size_kb": 0, 00:20:14.335 "state": "online", 00:20:14.335 "raid_level": "raid1", 00:20:14.335 "superblock": true, 00:20:14.335 "num_base_bdevs": 4, 00:20:14.335 "num_base_bdevs_discovered": 4, 00:20:14.335 "num_base_bdevs_operational": 4, 00:20:14.335 "base_bdevs_list": [ 00:20:14.335 { 00:20:14.335 "name": "pt1", 00:20:14.335 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:14.335 "is_configured": true, 00:20:14.335 "data_offset": 2048, 00:20:14.335 "data_size": 63488 00:20:14.335 }, 00:20:14.335 { 00:20:14.335 "name": "pt2", 00:20:14.335 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:14.335 "is_configured": true, 00:20:14.335 "data_offset": 2048, 00:20:14.335 "data_size": 63488 00:20:14.335 }, 00:20:14.335 { 00:20:14.335 "name": "pt3", 00:20:14.336 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:14.336 "is_configured": true, 00:20:14.336 "data_offset": 2048, 00:20:14.336 "data_size": 63488 00:20:14.336 }, 00:20:14.336 { 00:20:14.336 "name": "pt4", 00:20:14.336 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:14.336 "is_configured": true, 00:20:14.336 "data_offset": 2048, 00:20:14.336 "data_size": 63488 00:20:14.336 } 00:20:14.336 ] 00:20:14.336 }' 00:20:14.336 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.336 07:55:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:14.906 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:15.166 [2024-07-15 07:55:59.765109] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:15.166 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:15.166 "name": "raid_bdev1", 00:20:15.166 "aliases": [ 00:20:15.166 "546ab423-1120-4391-88e3-6f52dab2722f" 00:20:15.166 ], 00:20:15.166 "product_name": "Raid Volume", 00:20:15.166 "block_size": 512, 00:20:15.166 "num_blocks": 63488, 00:20:15.166 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:15.166 "assigned_rate_limits": { 00:20:15.166 "rw_ios_per_sec": 0, 00:20:15.166 "rw_mbytes_per_sec": 0, 00:20:15.166 "r_mbytes_per_sec": 0, 00:20:15.166 "w_mbytes_per_sec": 0 00:20:15.166 }, 00:20:15.166 "claimed": false, 00:20:15.166 "zoned": false, 00:20:15.166 "supported_io_types": { 00:20:15.166 "read": true, 00:20:15.166 "write": true, 00:20:15.166 "unmap": false, 00:20:15.166 "flush": false, 00:20:15.166 "reset": true, 00:20:15.166 "nvme_admin": false, 00:20:15.166 "nvme_io": false, 00:20:15.166 "nvme_io_md": false, 00:20:15.166 "write_zeroes": true, 00:20:15.166 "zcopy": false, 00:20:15.166 "get_zone_info": false, 00:20:15.166 "zone_management": false, 00:20:15.166 "zone_append": false, 00:20:15.166 "compare": false, 00:20:15.166 "compare_and_write": false, 00:20:15.166 "abort": false, 00:20:15.166 "seek_hole": false, 00:20:15.166 "seek_data": false, 00:20:15.166 "copy": false, 00:20:15.166 "nvme_iov_md": false 00:20:15.166 }, 00:20:15.166 "memory_domains": [ 00:20:15.166 { 00:20:15.166 "dma_device_id": "system", 00:20:15.166 "dma_device_type": 1 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.166 "dma_device_type": 2 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "system", 00:20:15.166 "dma_device_type": 1 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.166 "dma_device_type": 2 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "system", 00:20:15.166 "dma_device_type": 1 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.166 "dma_device_type": 2 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "system", 00:20:15.166 "dma_device_type": 1 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.166 "dma_device_type": 2 00:20:15.166 } 00:20:15.166 ], 00:20:15.166 "driver_specific": { 00:20:15.166 "raid": { 00:20:15.166 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:15.166 "strip_size_kb": 0, 00:20:15.166 "state": "online", 00:20:15.166 "raid_level": "raid1", 00:20:15.166 "superblock": true, 00:20:15.166 
"num_base_bdevs": 4, 00:20:15.166 "num_base_bdevs_discovered": 4, 00:20:15.166 "num_base_bdevs_operational": 4, 00:20:15.166 "base_bdevs_list": [ 00:20:15.166 { 00:20:15.166 "name": "pt1", 00:20:15.166 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:15.166 "is_configured": true, 00:20:15.166 "data_offset": 2048, 00:20:15.166 "data_size": 63488 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "name": "pt2", 00:20:15.166 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:15.166 "is_configured": true, 00:20:15.166 "data_offset": 2048, 00:20:15.166 "data_size": 63488 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "name": "pt3", 00:20:15.166 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:15.166 "is_configured": true, 00:20:15.166 "data_offset": 2048, 00:20:15.166 "data_size": 63488 00:20:15.166 }, 00:20:15.166 { 00:20:15.166 "name": "pt4", 00:20:15.166 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:15.166 "is_configured": true, 00:20:15.167 "data_offset": 2048, 00:20:15.167 "data_size": 63488 00:20:15.167 } 00:20:15.167 ] 00:20:15.167 } 00:20:15.167 } 00:20:15.167 }' 00:20:15.167 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:15.167 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:15.167 pt2 00:20:15.167 pt3 00:20:15.167 pt4' 00:20:15.167 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.167 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:15.167 07:55:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.428 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.428 "name": "pt1", 00:20:15.428 "aliases": [ 00:20:15.428 "00000000-0000-0000-0000-000000000001" 00:20:15.428 ], 00:20:15.428 "product_name": "passthru", 00:20:15.428 "block_size": 512, 00:20:15.428 "num_blocks": 65536, 00:20:15.428 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:15.428 "assigned_rate_limits": { 00:20:15.428 "rw_ios_per_sec": 0, 00:20:15.428 "rw_mbytes_per_sec": 0, 00:20:15.428 "r_mbytes_per_sec": 0, 00:20:15.428 "w_mbytes_per_sec": 0 00:20:15.428 }, 00:20:15.428 "claimed": true, 00:20:15.428 "claim_type": "exclusive_write", 00:20:15.428 "zoned": false, 00:20:15.428 "supported_io_types": { 00:20:15.428 "read": true, 00:20:15.428 "write": true, 00:20:15.428 "unmap": true, 00:20:15.428 "flush": true, 00:20:15.428 "reset": true, 00:20:15.428 "nvme_admin": false, 00:20:15.428 "nvme_io": false, 00:20:15.428 "nvme_io_md": false, 00:20:15.428 "write_zeroes": true, 00:20:15.428 "zcopy": true, 00:20:15.429 "get_zone_info": false, 00:20:15.429 "zone_management": false, 00:20:15.429 "zone_append": false, 00:20:15.429 "compare": false, 00:20:15.429 "compare_and_write": false, 00:20:15.429 "abort": true, 00:20:15.429 "seek_hole": false, 00:20:15.429 "seek_data": false, 00:20:15.429 "copy": true, 00:20:15.429 "nvme_iov_md": false 00:20:15.429 }, 00:20:15.429 "memory_domains": [ 00:20:15.429 { 00:20:15.429 "dma_device_id": "system", 00:20:15.429 "dma_device_type": 1 00:20:15.429 }, 00:20:15.429 { 00:20:15.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.429 "dma_device_type": 2 00:20:15.429 } 00:20:15.429 ], 00:20:15.429 "driver_specific": { 00:20:15.429 "passthru": { 00:20:15.429 
"name": "pt1", 00:20:15.429 "base_bdev_name": "malloc1" 00:20:15.429 } 00:20:15.429 } 00:20:15.429 }' 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.429 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:15.690 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.950 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.950 "name": "pt2", 00:20:15.950 "aliases": [ 00:20:15.950 "00000000-0000-0000-0000-000000000002" 00:20:15.950 ], 00:20:15.950 "product_name": "passthru", 00:20:15.950 "block_size": 512, 00:20:15.950 "num_blocks": 65536, 00:20:15.950 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:15.950 "assigned_rate_limits": { 00:20:15.950 "rw_ios_per_sec": 0, 00:20:15.950 "rw_mbytes_per_sec": 0, 00:20:15.950 "r_mbytes_per_sec": 0, 00:20:15.950 "w_mbytes_per_sec": 0 00:20:15.950 }, 00:20:15.951 "claimed": true, 00:20:15.951 "claim_type": "exclusive_write", 00:20:15.951 "zoned": false, 00:20:15.951 "supported_io_types": { 00:20:15.951 "read": true, 00:20:15.951 "write": true, 00:20:15.951 "unmap": true, 00:20:15.951 "flush": true, 00:20:15.951 "reset": true, 00:20:15.951 "nvme_admin": false, 00:20:15.951 "nvme_io": false, 00:20:15.951 "nvme_io_md": false, 00:20:15.951 "write_zeroes": true, 00:20:15.951 "zcopy": true, 00:20:15.951 "get_zone_info": false, 00:20:15.951 "zone_management": false, 00:20:15.951 "zone_append": false, 00:20:15.951 "compare": false, 00:20:15.951 "compare_and_write": false, 00:20:15.951 "abort": true, 00:20:15.951 "seek_hole": false, 00:20:15.951 "seek_data": false, 00:20:15.951 "copy": true, 00:20:15.951 "nvme_iov_md": false 00:20:15.951 }, 00:20:15.951 "memory_domains": [ 00:20:15.951 { 00:20:15.951 "dma_device_id": "system", 00:20:15.951 "dma_device_type": 1 00:20:15.951 }, 00:20:15.951 { 00:20:15.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.951 "dma_device_type": 2 00:20:15.951 } 00:20:15.951 ], 00:20:15.951 "driver_specific": { 00:20:15.951 "passthru": { 00:20:15.951 "name": "pt2", 00:20:15.951 "base_bdev_name": "malloc2" 00:20:15.951 } 00:20:15.951 } 00:20:15.951 }' 00:20:15.951 07:56:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.951 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.951 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.951 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.951 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:16.210 07:56:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.470 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:16.470 "name": "pt3", 00:20:16.470 "aliases": [ 00:20:16.470 "00000000-0000-0000-0000-000000000003" 00:20:16.470 ], 00:20:16.470 "product_name": "passthru", 00:20:16.470 "block_size": 512, 00:20:16.470 "num_blocks": 65536, 00:20:16.470 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:16.470 "assigned_rate_limits": { 00:20:16.470 "rw_ios_per_sec": 0, 00:20:16.470 "rw_mbytes_per_sec": 0, 00:20:16.470 "r_mbytes_per_sec": 0, 00:20:16.470 "w_mbytes_per_sec": 0 00:20:16.470 }, 00:20:16.470 "claimed": true, 00:20:16.470 "claim_type": "exclusive_write", 00:20:16.470 "zoned": false, 00:20:16.470 "supported_io_types": { 00:20:16.470 "read": true, 00:20:16.470 "write": true, 00:20:16.470 "unmap": true, 00:20:16.470 "flush": true, 00:20:16.470 "reset": true, 00:20:16.470 "nvme_admin": false, 00:20:16.470 "nvme_io": false, 00:20:16.470 "nvme_io_md": false, 00:20:16.470 "write_zeroes": true, 00:20:16.470 "zcopy": true, 00:20:16.470 "get_zone_info": false, 00:20:16.470 "zone_management": false, 00:20:16.470 "zone_append": false, 00:20:16.470 "compare": false, 00:20:16.470 "compare_and_write": false, 00:20:16.470 "abort": true, 00:20:16.470 "seek_hole": false, 00:20:16.470 "seek_data": false, 00:20:16.470 "copy": true, 00:20:16.470 "nvme_iov_md": false 00:20:16.470 }, 00:20:16.470 "memory_domains": [ 00:20:16.470 { 00:20:16.470 "dma_device_id": "system", 00:20:16.470 "dma_device_type": 1 00:20:16.470 }, 00:20:16.470 { 00:20:16.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.470 "dma_device_type": 2 00:20:16.470 } 00:20:16.470 ], 00:20:16.470 "driver_specific": { 00:20:16.470 "passthru": { 00:20:16.470 "name": "pt3", 00:20:16.470 "base_bdev_name": "malloc3" 00:20:16.470 } 00:20:16.470 } 00:20:16.470 }' 00:20:16.470 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.470 07:56:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.731 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.029 "name": "pt4", 00:20:17.029 "aliases": [ 00:20:17.029 "00000000-0000-0000-0000-000000000004" 00:20:17.029 ], 00:20:17.029 "product_name": "passthru", 00:20:17.029 "block_size": 512, 00:20:17.029 "num_blocks": 65536, 00:20:17.029 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:17.029 "assigned_rate_limits": { 00:20:17.029 "rw_ios_per_sec": 0, 00:20:17.029 "rw_mbytes_per_sec": 0, 00:20:17.029 "r_mbytes_per_sec": 0, 00:20:17.029 "w_mbytes_per_sec": 0 00:20:17.029 }, 00:20:17.029 "claimed": true, 00:20:17.029 "claim_type": "exclusive_write", 00:20:17.029 "zoned": false, 00:20:17.029 "supported_io_types": { 00:20:17.029 "read": true, 00:20:17.029 "write": true, 00:20:17.029 "unmap": true, 00:20:17.029 "flush": true, 00:20:17.029 "reset": true, 00:20:17.029 "nvme_admin": false, 00:20:17.029 "nvme_io": false, 00:20:17.029 "nvme_io_md": false, 00:20:17.029 "write_zeroes": true, 00:20:17.029 "zcopy": true, 00:20:17.029 "get_zone_info": false, 00:20:17.029 "zone_management": false, 00:20:17.029 "zone_append": false, 00:20:17.029 "compare": false, 00:20:17.029 "compare_and_write": false, 00:20:17.029 "abort": true, 00:20:17.029 "seek_hole": false, 00:20:17.029 "seek_data": false, 00:20:17.029 "copy": true, 00:20:17.029 "nvme_iov_md": false 00:20:17.029 }, 00:20:17.029 "memory_domains": [ 00:20:17.029 { 00:20:17.029 "dma_device_id": "system", 00:20:17.029 "dma_device_type": 1 00:20:17.029 }, 00:20:17.029 { 00:20:17.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.029 "dma_device_type": 2 00:20:17.029 } 00:20:17.029 ], 00:20:17.029 "driver_specific": { 00:20:17.029 "passthru": { 00:20:17.029 "name": "pt4", 00:20:17.029 "base_bdev_name": "malloc4" 00:20:17.029 } 00:20:17.029 } 00:20:17.029 }' 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:20:17.029 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.308 07:56:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.308 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.308 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:17.308 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:17.568 [2024-07-15 07:56:02.199269] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:17.568 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=546ab423-1120-4391-88e3-6f52dab2722f 00:20:17.568 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 546ab423-1120-4391-88e3-6f52dab2722f ']' 00:20:17.568 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:17.829 [2024-07-15 07:56:02.391508] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:17.829 [2024-07-15 07:56:02.391523] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:17.829 [2024-07-15 07:56:02.391559] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:17.829 [2024-07-15 07:56:02.391616] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:17.829 [2024-07-15 07:56:02.391623] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a89e20 name raid_bdev1, state offline 00:20:17.829 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.829 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:18.090 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:18.090 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:18.090 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:18.090 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:18.090 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:18.090 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:20:18.350 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:18.350 07:56:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:18.627 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:18.627 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:18.627 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:18.627 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:18.888 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:19.147 [2024-07-15 07:56:03.710800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:19.147 [2024-07-15 07:56:03.711863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:19.147 [2024-07-15 07:56:03.711898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:19.147 [2024-07-15 07:56:03.711924] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:19.147 [2024-07-15 07:56:03.711957] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:19.147 [2024-07-15 07:56:03.711983] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:19.147 [2024-07-15 07:56:03.711996] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:19.147 [2024-07-15 07:56:03.712009] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:19.147 [2024-07-15 07:56:03.712019] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:19.147 [2024-07-15 07:56:03.712025] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ddee0 name raid_bdev1, state configuring 00:20:19.147 request: 00:20:19.147 { 00:20:19.147 "name": "raid_bdev1", 00:20:19.147 "raid_level": "raid1", 00:20:19.147 "base_bdevs": [ 00:20:19.147 "malloc1", 00:20:19.147 "malloc2", 00:20:19.147 "malloc3", 00:20:19.147 "malloc4" 00:20:19.147 ], 00:20:19.147 "superblock": false, 00:20:19.147 "method": "bdev_raid_create", 00:20:19.147 "req_id": 1 00:20:19.147 } 00:20:19.147 Got JSON-RPC error response 00:20:19.147 response: 00:20:19.147 { 00:20:19.147 "code": -17, 00:20:19.147 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:19.147 } 00:20:19.147 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:19.147 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:19.147 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:19.147 07:56:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:19.147 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.147 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:19.407 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:19.407 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:19.407 07:56:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:19.407 [2024-07-15 07:56:04.095732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:19.407 [2024-07-15 07:56:04.095763] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.407 [2024-07-15 07:56:04.095774] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8da20 00:20:19.407 [2024-07-15 07:56:04.095780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.407 [2024-07-15 07:56:04.097074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.407 [2024-07-15 07:56:04.097093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:19.407 [2024-07-15 07:56:04.097140] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:20:19.407 [2024-07-15 07:56:04.097159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:19.407 pt1 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:19.407 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.666 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.666 "name": "raid_bdev1", 00:20:19.666 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:19.666 "strip_size_kb": 0, 00:20:19.666 "state": "configuring", 00:20:19.666 "raid_level": "raid1", 00:20:19.666 "superblock": true, 00:20:19.666 "num_base_bdevs": 4, 00:20:19.666 "num_base_bdevs_discovered": 1, 00:20:19.666 "num_base_bdevs_operational": 4, 00:20:19.666 "base_bdevs_list": [ 00:20:19.666 { 00:20:19.666 "name": "pt1", 00:20:19.666 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:19.666 "is_configured": true, 00:20:19.666 "data_offset": 2048, 00:20:19.666 "data_size": 63488 00:20:19.666 }, 00:20:19.666 { 00:20:19.666 "name": null, 00:20:19.666 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:19.666 "is_configured": false, 00:20:19.666 "data_offset": 2048, 00:20:19.666 "data_size": 63488 00:20:19.666 }, 00:20:19.666 { 00:20:19.666 "name": null, 00:20:19.666 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:19.666 "is_configured": false, 00:20:19.666 "data_offset": 2048, 00:20:19.666 "data_size": 63488 00:20:19.666 }, 00:20:19.666 { 00:20:19.666 "name": null, 00:20:19.666 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:19.666 "is_configured": false, 00:20:19.666 "data_offset": 2048, 00:20:19.666 "data_size": 63488 00:20:19.666 } 00:20:19.666 ] 00:20:19.666 }' 00:20:19.666 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.666 07:56:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.234 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:20.234 07:56:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:20:20.494 [2024-07-15 07:56:05.034101] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:20.494 [2024-07-15 07:56:05.034137] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.494 [2024-07-15 07:56:05.034149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18de160 00:20:20.494 [2024-07-15 07:56:05.034161] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.494 [2024-07-15 07:56:05.034430] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.494 [2024-07-15 07:56:05.034441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:20.494 [2024-07-15 07:56:05.034486] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:20.494 [2024-07-15 07:56:05.034499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:20.494 pt2 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:20.494 [2024-07-15 07:56:05.218574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.494 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.754 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.754 "name": "raid_bdev1", 00:20:20.754 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:20.754 "strip_size_kb": 0, 00:20:20.754 "state": "configuring", 00:20:20.754 "raid_level": "raid1", 00:20:20.754 "superblock": true, 00:20:20.754 "num_base_bdevs": 4, 00:20:20.754 "num_base_bdevs_discovered": 1, 00:20:20.754 "num_base_bdevs_operational": 4, 00:20:20.754 "base_bdevs_list": [ 00:20:20.754 { 00:20:20.754 "name": "pt1", 00:20:20.754 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:20.754 "is_configured": true, 00:20:20.754 "data_offset": 2048, 00:20:20.754 "data_size": 63488 00:20:20.754 }, 00:20:20.754 { 00:20:20.754 "name": null, 00:20:20.754 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:20:20.754 "is_configured": false, 00:20:20.754 "data_offset": 2048, 00:20:20.754 "data_size": 63488 00:20:20.754 }, 00:20:20.754 { 00:20:20.754 "name": null, 00:20:20.754 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:20.754 "is_configured": false, 00:20:20.754 "data_offset": 2048, 00:20:20.754 "data_size": 63488 00:20:20.754 }, 00:20:20.754 { 00:20:20.754 "name": null, 00:20:20.754 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:20.754 "is_configured": false, 00:20:20.754 "data_offset": 2048, 00:20:20.754 "data_size": 63488 00:20:20.754 } 00:20:20.754 ] 00:20:20.754 }' 00:20:20.754 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.754 07:56:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.323 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:21.323 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:21.323 07:56:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:21.583 [2024-07-15 07:56:06.144922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:21.583 [2024-07-15 07:56:06.144956] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.583 [2024-07-15 07:56:06.144968] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dedf0 00:20:21.583 [2024-07-15 07:56:06.144979] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.583 [2024-07-15 07:56:06.145241] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.583 [2024-07-15 07:56:06.145252] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:21.583 [2024-07-15 07:56:06.145297] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:21.583 [2024-07-15 07:56:06.145309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:21.583 pt2 00:20:21.583 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:21.583 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:21.583 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:21.843 [2024-07-15 07:56:06.341416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:21.843 [2024-07-15 07:56:06.341441] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.843 [2024-07-15 07:56:06.341450] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8cf90 00:20:21.843 [2024-07-15 07:56:06.341457] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.843 [2024-07-15 07:56:06.341681] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.843 [2024-07-15 07:56:06.341691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:21.843 [2024-07-15 07:56:06.341734] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:20:21.843 [2024-07-15 07:56:06.341746] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:21.843 pt3 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:21.844 [2024-07-15 07:56:06.537907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:21.844 [2024-07-15 07:56:06.537928] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.844 [2024-07-15 07:56:06.537938] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ddb30 00:20:21.844 [2024-07-15 07:56:06.537944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.844 [2024-07-15 07:56:06.538157] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.844 [2024-07-15 07:56:06.538167] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:21.844 [2024-07-15 07:56:06.538199] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:21.844 [2024-07-15 07:56:06.538209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:21.844 [2024-07-15 07:56:06.538300] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a88aa0 00:20:21.844 [2024-07-15 07:56:06.538305] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:21.844 [2024-07-15 07:56:06.538443] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8a260 00:20:21.844 [2024-07-15 07:56:06.538548] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a88aa0 00:20:21.844 [2024-07-15 07:56:06.538553] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a88aa0 00:20:21.844 [2024-07-15 07:56:06.538623] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:21.844 pt4 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.844 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.105 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.105 "name": "raid_bdev1", 00:20:22.105 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:22.105 "strip_size_kb": 0, 00:20:22.105 "state": "online", 00:20:22.105 "raid_level": "raid1", 00:20:22.105 "superblock": true, 00:20:22.105 "num_base_bdevs": 4, 00:20:22.105 "num_base_bdevs_discovered": 4, 00:20:22.105 "num_base_bdevs_operational": 4, 00:20:22.105 "base_bdevs_list": [ 00:20:22.105 { 00:20:22.105 "name": "pt1", 00:20:22.105 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:22.105 "is_configured": true, 00:20:22.105 "data_offset": 2048, 00:20:22.105 "data_size": 63488 00:20:22.105 }, 00:20:22.105 { 00:20:22.105 "name": "pt2", 00:20:22.105 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:22.105 "is_configured": true, 00:20:22.105 "data_offset": 2048, 00:20:22.105 "data_size": 63488 00:20:22.105 }, 00:20:22.105 { 00:20:22.105 "name": "pt3", 00:20:22.105 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:22.105 "is_configured": true, 00:20:22.105 "data_offset": 2048, 00:20:22.105 "data_size": 63488 00:20:22.105 }, 00:20:22.105 { 00:20:22.105 "name": "pt4", 00:20:22.105 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:22.105 "is_configured": true, 00:20:22.105 "data_offset": 2048, 00:20:22.105 "data_size": 63488 00:20:22.105 } 00:20:22.105 ] 00:20:22.105 }' 00:20:22.105 07:56:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.105 07:56:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:22.674 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:22.933 [2024-07-15 07:56:07.472525] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:22.933 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:22.933 "name": "raid_bdev1", 00:20:22.933 "aliases": [ 00:20:22.933 "546ab423-1120-4391-88e3-6f52dab2722f" 00:20:22.933 ], 00:20:22.934 "product_name": "Raid Volume", 00:20:22.934 "block_size": 512, 00:20:22.934 "num_blocks": 63488, 00:20:22.934 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:22.934 "assigned_rate_limits": { 00:20:22.934 "rw_ios_per_sec": 0, 
00:20:22.934 "rw_mbytes_per_sec": 0, 00:20:22.934 "r_mbytes_per_sec": 0, 00:20:22.934 "w_mbytes_per_sec": 0 00:20:22.934 }, 00:20:22.934 "claimed": false, 00:20:22.934 "zoned": false, 00:20:22.934 "supported_io_types": { 00:20:22.934 "read": true, 00:20:22.934 "write": true, 00:20:22.934 "unmap": false, 00:20:22.934 "flush": false, 00:20:22.934 "reset": true, 00:20:22.934 "nvme_admin": false, 00:20:22.934 "nvme_io": false, 00:20:22.934 "nvme_io_md": false, 00:20:22.934 "write_zeroes": true, 00:20:22.934 "zcopy": false, 00:20:22.934 "get_zone_info": false, 00:20:22.934 "zone_management": false, 00:20:22.934 "zone_append": false, 00:20:22.934 "compare": false, 00:20:22.934 "compare_and_write": false, 00:20:22.934 "abort": false, 00:20:22.934 "seek_hole": false, 00:20:22.934 "seek_data": false, 00:20:22.934 "copy": false, 00:20:22.934 "nvme_iov_md": false 00:20:22.934 }, 00:20:22.934 "memory_domains": [ 00:20:22.934 { 00:20:22.934 "dma_device_id": "system", 00:20:22.934 "dma_device_type": 1 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.934 "dma_device_type": 2 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "system", 00:20:22.934 "dma_device_type": 1 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.934 "dma_device_type": 2 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "system", 00:20:22.934 "dma_device_type": 1 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.934 "dma_device_type": 2 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "system", 00:20:22.934 "dma_device_type": 1 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.934 "dma_device_type": 2 00:20:22.934 } 00:20:22.934 ], 00:20:22.934 "driver_specific": { 00:20:22.934 "raid": { 00:20:22.934 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:22.934 "strip_size_kb": 0, 00:20:22.934 "state": "online", 00:20:22.934 "raid_level": "raid1", 00:20:22.934 "superblock": true, 00:20:22.934 "num_base_bdevs": 4, 00:20:22.934 "num_base_bdevs_discovered": 4, 00:20:22.934 "num_base_bdevs_operational": 4, 00:20:22.934 "base_bdevs_list": [ 00:20:22.934 { 00:20:22.934 "name": "pt1", 00:20:22.934 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:22.934 "is_configured": true, 00:20:22.934 "data_offset": 2048, 00:20:22.934 "data_size": 63488 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "name": "pt2", 00:20:22.934 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:22.934 "is_configured": true, 00:20:22.934 "data_offset": 2048, 00:20:22.934 "data_size": 63488 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "name": "pt3", 00:20:22.934 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:22.934 "is_configured": true, 00:20:22.934 "data_offset": 2048, 00:20:22.934 "data_size": 63488 00:20:22.934 }, 00:20:22.934 { 00:20:22.934 "name": "pt4", 00:20:22.934 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:22.934 "is_configured": true, 00:20:22.934 "data_offset": 2048, 00:20:22.934 "data_size": 63488 00:20:22.934 } 00:20:22.934 ] 00:20:22.934 } 00:20:22.934 } 00:20:22.934 }' 00:20:22.934 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:22.934 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:22.934 pt2 00:20:22.934 pt3 00:20:22.934 pt4' 00:20:22.934 07:56:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.934 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.934 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:23.193 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.193 "name": "pt1", 00:20:23.193 "aliases": [ 00:20:23.193 "00000000-0000-0000-0000-000000000001" 00:20:23.193 ], 00:20:23.193 "product_name": "passthru", 00:20:23.193 "block_size": 512, 00:20:23.193 "num_blocks": 65536, 00:20:23.193 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:23.194 "assigned_rate_limits": { 00:20:23.194 "rw_ios_per_sec": 0, 00:20:23.194 "rw_mbytes_per_sec": 0, 00:20:23.194 "r_mbytes_per_sec": 0, 00:20:23.194 "w_mbytes_per_sec": 0 00:20:23.194 }, 00:20:23.194 "claimed": true, 00:20:23.194 "claim_type": "exclusive_write", 00:20:23.194 "zoned": false, 00:20:23.194 "supported_io_types": { 00:20:23.194 "read": true, 00:20:23.194 "write": true, 00:20:23.194 "unmap": true, 00:20:23.194 "flush": true, 00:20:23.194 "reset": true, 00:20:23.194 "nvme_admin": false, 00:20:23.194 "nvme_io": false, 00:20:23.194 "nvme_io_md": false, 00:20:23.194 "write_zeroes": true, 00:20:23.194 "zcopy": true, 00:20:23.194 "get_zone_info": false, 00:20:23.194 "zone_management": false, 00:20:23.194 "zone_append": false, 00:20:23.194 "compare": false, 00:20:23.194 "compare_and_write": false, 00:20:23.194 "abort": true, 00:20:23.194 "seek_hole": false, 00:20:23.194 "seek_data": false, 00:20:23.194 "copy": true, 00:20:23.194 "nvme_iov_md": false 00:20:23.194 }, 00:20:23.194 "memory_domains": [ 00:20:23.194 { 00:20:23.194 "dma_device_id": "system", 00:20:23.194 "dma_device_type": 1 00:20:23.194 }, 00:20:23.194 { 00:20:23.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.194 "dma_device_type": 2 00:20:23.194 } 00:20:23.194 ], 00:20:23.194 "driver_specific": { 00:20:23.194 "passthru": { 00:20:23.194 "name": "pt1", 00:20:23.194 "base_bdev_name": "malloc1" 00:20:23.194 } 00:20:23.194 } 00:20:23.194 }' 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.194 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.453 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.453 07:56:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.453 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.453 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.453 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.453 07:56:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:23.453 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.711 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.711 "name": "pt2", 00:20:23.711 "aliases": [ 00:20:23.711 "00000000-0000-0000-0000-000000000002" 00:20:23.711 ], 00:20:23.711 "product_name": "passthru", 00:20:23.711 "block_size": 512, 00:20:23.711 "num_blocks": 65536, 00:20:23.711 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:23.711 "assigned_rate_limits": { 00:20:23.711 "rw_ios_per_sec": 0, 00:20:23.711 "rw_mbytes_per_sec": 0, 00:20:23.711 "r_mbytes_per_sec": 0, 00:20:23.711 "w_mbytes_per_sec": 0 00:20:23.711 }, 00:20:23.711 "claimed": true, 00:20:23.711 "claim_type": "exclusive_write", 00:20:23.711 "zoned": false, 00:20:23.711 "supported_io_types": { 00:20:23.711 "read": true, 00:20:23.711 "write": true, 00:20:23.711 "unmap": true, 00:20:23.711 "flush": true, 00:20:23.711 "reset": true, 00:20:23.711 "nvme_admin": false, 00:20:23.711 "nvme_io": false, 00:20:23.711 "nvme_io_md": false, 00:20:23.711 "write_zeroes": true, 00:20:23.711 "zcopy": true, 00:20:23.712 "get_zone_info": false, 00:20:23.712 "zone_management": false, 00:20:23.712 "zone_append": false, 00:20:23.712 "compare": false, 00:20:23.712 "compare_and_write": false, 00:20:23.712 "abort": true, 00:20:23.712 "seek_hole": false, 00:20:23.712 "seek_data": false, 00:20:23.712 "copy": true, 00:20:23.712 "nvme_iov_md": false 00:20:23.712 }, 00:20:23.712 "memory_domains": [ 00:20:23.712 { 00:20:23.712 "dma_device_id": "system", 00:20:23.712 "dma_device_type": 1 00:20:23.712 }, 00:20:23.712 { 00:20:23.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.712 "dma_device_type": 2 00:20:23.712 } 00:20:23.712 ], 00:20:23.712 "driver_specific": { 00:20:23.712 "passthru": { 00:20:23.712 "name": "pt2", 00:20:23.712 "base_bdev_name": "malloc2" 00:20:23.712 } 00:20:23.712 } 00:20:23.712 }' 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.712 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:23.971 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.229 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.229 "name": "pt3", 00:20:24.229 "aliases": [ 00:20:24.229 "00000000-0000-0000-0000-000000000003" 00:20:24.229 ], 00:20:24.229 "product_name": "passthru", 00:20:24.229 "block_size": 512, 00:20:24.229 "num_blocks": 65536, 00:20:24.229 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:24.229 "assigned_rate_limits": { 00:20:24.229 "rw_ios_per_sec": 0, 00:20:24.229 "rw_mbytes_per_sec": 0, 00:20:24.229 "r_mbytes_per_sec": 0, 00:20:24.229 "w_mbytes_per_sec": 0 00:20:24.229 }, 00:20:24.229 "claimed": true, 00:20:24.229 "claim_type": "exclusive_write", 00:20:24.229 "zoned": false, 00:20:24.229 "supported_io_types": { 00:20:24.229 "read": true, 00:20:24.229 "write": true, 00:20:24.229 "unmap": true, 00:20:24.229 "flush": true, 00:20:24.229 "reset": true, 00:20:24.229 "nvme_admin": false, 00:20:24.229 "nvme_io": false, 00:20:24.229 "nvme_io_md": false, 00:20:24.229 "write_zeroes": true, 00:20:24.229 "zcopy": true, 00:20:24.229 "get_zone_info": false, 00:20:24.229 "zone_management": false, 00:20:24.229 "zone_append": false, 00:20:24.229 "compare": false, 00:20:24.229 "compare_and_write": false, 00:20:24.229 "abort": true, 00:20:24.229 "seek_hole": false, 00:20:24.229 "seek_data": false, 00:20:24.229 "copy": true, 00:20:24.229 "nvme_iov_md": false 00:20:24.229 }, 00:20:24.229 "memory_domains": [ 00:20:24.229 { 00:20:24.229 "dma_device_id": "system", 00:20:24.229 "dma_device_type": 1 00:20:24.229 }, 00:20:24.229 { 00:20:24.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.230 "dma_device_type": 2 00:20:24.230 } 00:20:24.230 ], 00:20:24.230 "driver_specific": { 00:20:24.230 "passthru": { 00:20:24.230 "name": "pt3", 00:20:24.230 "base_bdev_name": "malloc3" 00:20:24.230 } 00:20:24.230 } 00:20:24.230 }' 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.230 07:56:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:24.488 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.748 "name": "pt4", 00:20:24.748 "aliases": [ 00:20:24.748 "00000000-0000-0000-0000-000000000004" 00:20:24.748 ], 00:20:24.748 "product_name": "passthru", 00:20:24.748 "block_size": 512, 00:20:24.748 "num_blocks": 65536, 00:20:24.748 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:24.748 "assigned_rate_limits": { 00:20:24.748 "rw_ios_per_sec": 0, 00:20:24.748 "rw_mbytes_per_sec": 0, 00:20:24.748 "r_mbytes_per_sec": 0, 00:20:24.748 "w_mbytes_per_sec": 0 00:20:24.748 }, 00:20:24.748 "claimed": true, 00:20:24.748 "claim_type": "exclusive_write", 00:20:24.748 "zoned": false, 00:20:24.748 "supported_io_types": { 00:20:24.748 "read": true, 00:20:24.748 "write": true, 00:20:24.748 "unmap": true, 00:20:24.748 "flush": true, 00:20:24.748 "reset": true, 00:20:24.748 "nvme_admin": false, 00:20:24.748 "nvme_io": false, 00:20:24.748 "nvme_io_md": false, 00:20:24.748 "write_zeroes": true, 00:20:24.748 "zcopy": true, 00:20:24.748 "get_zone_info": false, 00:20:24.748 "zone_management": false, 00:20:24.748 "zone_append": false, 00:20:24.748 "compare": false, 00:20:24.748 "compare_and_write": false, 00:20:24.748 "abort": true, 00:20:24.748 "seek_hole": false, 00:20:24.748 "seek_data": false, 00:20:24.748 "copy": true, 00:20:24.748 "nvme_iov_md": false 00:20:24.748 }, 00:20:24.748 "memory_domains": [ 00:20:24.748 { 00:20:24.748 "dma_device_id": "system", 00:20:24.748 "dma_device_type": 1 00:20:24.748 }, 00:20:24.748 { 00:20:24.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.748 "dma_device_type": 2 00:20:24.748 } 00:20:24.748 ], 00:20:24.748 "driver_specific": { 00:20:24.748 "passthru": { 00:20:24.748 "name": "pt4", 00:20:24.748 "base_bdev_name": "malloc4" 00:20:24.748 } 00:20:24.748 } 00:20:24.748 }' 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.748 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:25.007 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:25.265 [2024-07-15 07:56:09.846536] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:25.265 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 
546ab423-1120-4391-88e3-6f52dab2722f '!=' 546ab423-1120-4391-88e3-6f52dab2722f ']' 00:20:25.265 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:20:25.265 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:25.265 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:25.265 07:56:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:25.524 [2024-07-15 07:56:10.046826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.524 "name": "raid_bdev1", 00:20:25.524 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:25.524 "strip_size_kb": 0, 00:20:25.524 "state": "online", 00:20:25.524 "raid_level": "raid1", 00:20:25.524 "superblock": true, 00:20:25.524 "num_base_bdevs": 4, 00:20:25.524 "num_base_bdevs_discovered": 3, 00:20:25.524 "num_base_bdevs_operational": 3, 00:20:25.524 "base_bdevs_list": [ 00:20:25.524 { 00:20:25.524 "name": null, 00:20:25.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.524 "is_configured": false, 00:20:25.524 "data_offset": 2048, 00:20:25.524 "data_size": 63488 00:20:25.524 }, 00:20:25.524 { 00:20:25.524 "name": "pt2", 00:20:25.524 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:25.524 "is_configured": true, 00:20:25.524 "data_offset": 2048, 00:20:25.524 "data_size": 63488 00:20:25.524 }, 00:20:25.524 { 00:20:25.524 "name": "pt3", 00:20:25.524 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:25.524 "is_configured": true, 00:20:25.524 "data_offset": 2048, 00:20:25.524 "data_size": 63488 00:20:25.524 }, 00:20:25.524 { 00:20:25.524 "name": "pt4", 00:20:25.524 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:25.524 "is_configured": true, 00:20:25.524 "data_offset": 2048, 00:20:25.524 "data_size": 63488 00:20:25.524 } 00:20:25.524 ] 00:20:25.524 }' 00:20:25.524 07:56:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.524 07:56:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.091 07:56:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:26.351 [2024-07-15 07:56:10.989188] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:26.351 [2024-07-15 07:56:10.989206] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:26.351 [2024-07-15 07:56:10.989244] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:26.351 [2024-07-15 07:56:10.989297] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:26.351 [2024-07-15 07:56:10.989303] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a88aa0 name raid_bdev1, state offline 00:20:26.351 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.351 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:20:26.611 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:20:26.611 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:20:26.611 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:20:26.611 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:20:26.611 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:26.871 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:20:26.871 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:20:26.871 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:26.871 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:20:26.871 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:20:26.871 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:27.131 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:20:27.131 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:20:27.131 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:20:27.131 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:20:27.131 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:27.392 [2024-07-15 07:56:11.927524] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:27.392 [2024-07-15 07:56:11.927558] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:27.392 [2024-07-15 07:56:11.927571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8fe50 00:20:27.392 [2024-07-15 07:56:11.927577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:27.392 [2024-07-15 07:56:11.928891] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:27.392 [2024-07-15 07:56:11.928912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:27.392 [2024-07-15 07:56:11.928960] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:27.392 [2024-07-15 07:56:11.928979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:27.392 pt2 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.392 07:56:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.392 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.392 "name": "raid_bdev1", 00:20:27.392 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:27.392 "strip_size_kb": 0, 00:20:27.392 "state": "configuring", 00:20:27.392 "raid_level": "raid1", 00:20:27.392 "superblock": true, 00:20:27.392 "num_base_bdevs": 4, 00:20:27.392 "num_base_bdevs_discovered": 1, 00:20:27.392 "num_base_bdevs_operational": 3, 00:20:27.392 "base_bdevs_list": [ 00:20:27.392 { 00:20:27.392 "name": null, 00:20:27.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.392 "is_configured": false, 00:20:27.392 "data_offset": 2048, 00:20:27.392 "data_size": 63488 00:20:27.392 }, 00:20:27.392 { 00:20:27.392 "name": "pt2", 00:20:27.392 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:27.392 "is_configured": true, 00:20:27.392 "data_offset": 2048, 00:20:27.392 "data_size": 63488 00:20:27.392 }, 00:20:27.392 { 00:20:27.392 "name": null, 00:20:27.392 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:27.392 "is_configured": false, 00:20:27.392 "data_offset": 2048, 00:20:27.392 "data_size": 63488 00:20:27.392 }, 00:20:27.392 { 00:20:27.392 "name": null, 00:20:27.392 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:27.392 "is_configured": 
false, 00:20:27.392 "data_offset": 2048, 00:20:27.392 "data_size": 63488 00:20:27.392 } 00:20:27.392 ] 00:20:27.392 }' 00:20:27.392 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.392 07:56:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.962 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:20:27.962 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:20:27.962 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:28.223 [2024-07-15 07:56:12.857881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:28.223 [2024-07-15 07:56:12.857915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:28.223 [2024-07-15 07:56:12.857926] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8f990 00:20:28.223 [2024-07-15 07:56:12.857932] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:28.223 [2024-07-15 07:56:12.858202] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:28.223 [2024-07-15 07:56:12.858213] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:28.223 [2024-07-15 07:56:12.858257] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:28.223 [2024-07-15 07:56:12.858270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:28.223 pt3 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.223 07:56:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.484 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.484 "name": "raid_bdev1", 00:20:28.484 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:28.484 "strip_size_kb": 0, 00:20:28.484 "state": "configuring", 00:20:28.484 "raid_level": "raid1", 00:20:28.484 "superblock": true, 00:20:28.484 "num_base_bdevs": 
4, 00:20:28.484 "num_base_bdevs_discovered": 2, 00:20:28.484 "num_base_bdevs_operational": 3, 00:20:28.484 "base_bdevs_list": [ 00:20:28.484 { 00:20:28.484 "name": null, 00:20:28.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.484 "is_configured": false, 00:20:28.484 "data_offset": 2048, 00:20:28.484 "data_size": 63488 00:20:28.484 }, 00:20:28.484 { 00:20:28.484 "name": "pt2", 00:20:28.484 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:28.484 "is_configured": true, 00:20:28.484 "data_offset": 2048, 00:20:28.484 "data_size": 63488 00:20:28.484 }, 00:20:28.484 { 00:20:28.484 "name": "pt3", 00:20:28.484 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:28.484 "is_configured": true, 00:20:28.484 "data_offset": 2048, 00:20:28.484 "data_size": 63488 00:20:28.484 }, 00:20:28.484 { 00:20:28.484 "name": null, 00:20:28.484 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:28.484 "is_configured": false, 00:20:28.484 "data_offset": 2048, 00:20:28.484 "data_size": 63488 00:20:28.484 } 00:20:28.484 ] 00:20:28.484 }' 00:20:28.484 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.484 07:56:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:29.054 [2024-07-15 07:56:13.792253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:29.054 [2024-07-15 07:56:13.792287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.054 [2024-07-15 07:56:13.792299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dd6a0 00:20:29.054 [2024-07-15 07:56:13.792306] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.054 [2024-07-15 07:56:13.792574] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.054 [2024-07-15 07:56:13.792585] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:29.054 [2024-07-15 07:56:13.792630] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:29.054 [2024-07-15 07:56:13.792643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:29.054 [2024-07-15 07:56:13.792740] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a8ca50 00:20:29.054 [2024-07-15 07:56:13.792747] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:29.054 [2024-07-15 07:56:13.792883] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8cd30 00:20:29.054 [2024-07-15 07:56:13.792984] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a8ca50 00:20:29.054 [2024-07-15 07:56:13.792990] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a8ca50 00:20:29.054 [2024-07-15 07:56:13.793068] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.054 pt4 00:20:29.054 07:56:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.054 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.314 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.314 07:56:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.314 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.314 "name": "raid_bdev1", 00:20:29.314 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:29.314 "strip_size_kb": 0, 00:20:29.314 "state": "online", 00:20:29.314 "raid_level": "raid1", 00:20:29.314 "superblock": true, 00:20:29.314 "num_base_bdevs": 4, 00:20:29.314 "num_base_bdevs_discovered": 3, 00:20:29.314 "num_base_bdevs_operational": 3, 00:20:29.314 "base_bdevs_list": [ 00:20:29.314 { 00:20:29.314 "name": null, 00:20:29.314 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.314 "is_configured": false, 00:20:29.314 "data_offset": 2048, 00:20:29.314 "data_size": 63488 00:20:29.314 }, 00:20:29.314 { 00:20:29.314 "name": "pt2", 00:20:29.314 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:29.314 "is_configured": true, 00:20:29.314 "data_offset": 2048, 00:20:29.314 "data_size": 63488 00:20:29.314 }, 00:20:29.314 { 00:20:29.314 "name": "pt3", 00:20:29.314 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:29.314 "is_configured": true, 00:20:29.314 "data_offset": 2048, 00:20:29.314 "data_size": 63488 00:20:29.314 }, 00:20:29.314 { 00:20:29.314 "name": "pt4", 00:20:29.314 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:29.314 "is_configured": true, 00:20:29.314 "data_offset": 2048, 00:20:29.314 "data_size": 63488 00:20:29.314 } 00:20:29.314 ] 00:20:29.314 }' 00:20:29.314 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.314 07:56:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.884 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:30.145 [2024-07-15 07:56:14.682498] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:30.145 [2024-07-15 07:56:14.682514] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:30.145 [2024-07-15 07:56:14.682553] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:20:30.145 [2024-07-15 07:56:14.682604] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:30.145 [2024-07-15 07:56:14.682610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8ca50 name raid_bdev1, state offline 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:20:30.145 07:56:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:30.405 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:30.405 [2024-07-15 07:56:15.151670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:30.405 [2024-07-15 07:56:15.151701] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.405 [2024-07-15 07:56:15.151714] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8f730 00:20:30.405 [2024-07-15 07:56:15.151720] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.405 [2024-07-15 07:56:15.153000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.405 [2024-07-15 07:56:15.153019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:30.405 [2024-07-15 07:56:15.153064] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:30.405 [2024-07-15 07:56:15.153082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:30.405 [2024-07-15 07:56:15.153152] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:20:30.405 [2024-07-15 07:56:15.153160] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:30.405 [2024-07-15 07:56:15.153168] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a8dd40 name raid_bdev1, state configuring 00:20:30.406 [2024-07-15 07:56:15.153183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:30.406 [2024-07-15 07:56:15.153239] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:30.406 pt1 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.665 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.665 "name": "raid_bdev1", 00:20:30.665 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:30.665 "strip_size_kb": 0, 00:20:30.665 "state": "configuring", 00:20:30.665 "raid_level": "raid1", 00:20:30.665 "superblock": true, 00:20:30.665 "num_base_bdevs": 4, 00:20:30.666 "num_base_bdevs_discovered": 2, 00:20:30.666 "num_base_bdevs_operational": 3, 00:20:30.666 "base_bdevs_list": [ 00:20:30.666 { 00:20:30.666 "name": null, 00:20:30.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.666 "is_configured": false, 00:20:30.666 "data_offset": 2048, 00:20:30.666 "data_size": 63488 00:20:30.666 }, 00:20:30.666 { 00:20:30.666 "name": "pt2", 00:20:30.666 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:30.666 "is_configured": true, 00:20:30.666 "data_offset": 2048, 00:20:30.666 "data_size": 63488 00:20:30.666 }, 00:20:30.666 { 00:20:30.666 "name": "pt3", 00:20:30.666 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:30.666 "is_configured": true, 00:20:30.666 "data_offset": 2048, 00:20:30.666 "data_size": 63488 00:20:30.666 }, 00:20:30.666 { 00:20:30.666 "name": null, 00:20:30.666 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:30.666 "is_configured": false, 00:20:30.666 "data_offset": 2048, 00:20:30.666 "data_size": 63488 00:20:30.666 } 00:20:30.666 ] 00:20:30.666 }' 00:20:30.666 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.666 07:56:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:31.267 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:20:31.267 07:56:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:20:31.528 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:20:31.528 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:31.528 [2024-07-15 07:56:16.282531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:31.528 [2024-07-15 07:56:16.282561] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:31.528 [2024-07-15 07:56:16.282572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8f3c0 00:20:31.528 [2024-07-15 07:56:16.282578] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:31.528 [2024-07-15 07:56:16.282845] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:31.528 [2024-07-15 07:56:16.282855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:31.528 [2024-07-15 07:56:16.282899] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:31.528 [2024-07-15 07:56:16.282911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:31.528 [2024-07-15 07:56:16.282996] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18ddc20 00:20:31.528 [2024-07-15 07:56:16.283002] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:31.528 [2024-07-15 07:56:16.283136] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8d530 00:20:31.528 [2024-07-15 07:56:16.283238] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18ddc20 00:20:31.528 [2024-07-15 07:56:16.283243] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18ddc20 00:20:31.528 [2024-07-15 07:56:16.283313] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.789 pt4 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.789 "name": "raid_bdev1", 00:20:31.789 "uuid": "546ab423-1120-4391-88e3-6f52dab2722f", 00:20:31.789 "strip_size_kb": 0, 00:20:31.789 "state": "online", 00:20:31.789 "raid_level": "raid1", 00:20:31.789 "superblock": true, 00:20:31.789 "num_base_bdevs": 4, 00:20:31.789 "num_base_bdevs_discovered": 3, 00:20:31.789 "num_base_bdevs_operational": 3, 00:20:31.789 "base_bdevs_list": [ 00:20:31.789 { 00:20:31.789 "name": null, 00:20:31.789 
"uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.789 "is_configured": false, 00:20:31.789 "data_offset": 2048, 00:20:31.789 "data_size": 63488 00:20:31.789 }, 00:20:31.789 { 00:20:31.789 "name": "pt2", 00:20:31.789 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:31.789 "is_configured": true, 00:20:31.789 "data_offset": 2048, 00:20:31.789 "data_size": 63488 00:20:31.789 }, 00:20:31.789 { 00:20:31.789 "name": "pt3", 00:20:31.789 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:31.789 "is_configured": true, 00:20:31.789 "data_offset": 2048, 00:20:31.789 "data_size": 63488 00:20:31.789 }, 00:20:31.789 { 00:20:31.789 "name": "pt4", 00:20:31.789 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:31.789 "is_configured": true, 00:20:31.789 "data_offset": 2048, 00:20:31.789 "data_size": 63488 00:20:31.789 } 00:20:31.789 ] 00:20:31.789 }' 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.789 07:56:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.361 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:20:32.361 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:20:32.622 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:20:32.622 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:32.622 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:20:32.882 [2024-07-15 07:56:17.425651] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 546ab423-1120-4391-88e3-6f52dab2722f '!=' 546ab423-1120-4391-88e3-6f52dab2722f ']' 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1697547 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1697547 ']' 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1697547 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1697547 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1697547' 00:20:32.883 killing process with pid 1697547 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1697547 00:20:32.883 [2024-07-15 07:56:17.495340] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:32.883 [2024-07-15 07:56:17.495377] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:32.883 [2024-07-15 07:56:17.495426] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:32.883 [2024-07-15 07:56:17.495432] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ddc20 name raid_bdev1, state offline 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1697547 00:20:32.883 [2024-07-15 07:56:17.516029] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:32.883 00:20:32.883 real 0m21.423s 00:20:32.883 user 0m40.064s 00:20:32.883 sys 0m3.115s 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:32.883 07:56:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.883 ************************************ 00:20:32.883 END TEST raid_superblock_test 00:20:32.883 ************************************ 00:20:33.143 07:56:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:33.143 07:56:17 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:20:33.143 07:56:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:33.143 07:56:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:33.143 07:56:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:33.143 ************************************ 00:20:33.143 START TEST raid_read_error_test 00:20:33.143 ************************************ 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.143 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 
'BaseBdev3' 'BaseBdev4') 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.IRvkZzHbAr 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1701597 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1701597 /var/tmp/spdk-raid.sock 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1701597 ']' 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:33.144 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:33.144 07:56:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.144 [2024-07-15 07:56:17.789631] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:20:33.144 [2024-07-15 07:56:17.789690] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1701597 ] 00:20:33.144 [2024-07-15 07:56:17.879279] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:33.404 [2024-07-15 07:56:17.947354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:33.404 [2024-07-15 07:56:17.989733] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.404 [2024-07-15 07:56:17.989759] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.975 07:56:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:33.975 07:56:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:33.975 07:56:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:33.975 07:56:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:34.236 BaseBdev1_malloc 00:20:34.236 07:56:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:34.497 true 00:20:34.497 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:34.497 [2024-07-15 07:56:19.164559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:34.497 [2024-07-15 07:56:19.164589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.497 [2024-07-15 07:56:19.164600] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2040b50 00:20:34.497 [2024-07-15 07:56:19.164607] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.497 [2024-07-15 07:56:19.165910] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.497 [2024-07-15 07:56:19.165930] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:34.497 BaseBdev1 00:20:34.497 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:34.497 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:34.759 BaseBdev2_malloc 00:20:34.759 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:35.019 true 00:20:35.019 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:35.019 [2024-07-15 07:56:19.719925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:35.019 [2024-07-15 07:56:19.719952] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.019 [2024-07-15 07:56:19.719962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2024ea0 00:20:35.019 [2024-07-15 07:56:19.719968] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.019 [2024-07-15 07:56:19.721143] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.019 [2024-07-15 07:56:19.721162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:35.019 BaseBdev2 00:20:35.019 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:35.019 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:35.279 BaseBdev3_malloc 00:20:35.279 07:56:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:35.539 true 00:20:35.539 07:56:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:35.539 [2024-07-15 07:56:20.279146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:35.539 [2024-07-15 07:56:20.279174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:35.539 [2024-07-15 07:56:20.279187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2028fb0 00:20:35.539 [2024-07-15 07:56:20.279194] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:35.539 [2024-07-15 07:56:20.280386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:35.539 [2024-07-15 07:56:20.280405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:35.539 BaseBdev3 00:20:35.539 07:56:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:35.539 07:56:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:35.799 BaseBdev4_malloc 00:20:35.799 07:56:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:36.059 true 00:20:36.059 07:56:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:36.325 [2024-07-15 07:56:20.834401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:36.325 [2024-07-15 07:56:20.834430] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.325 [2024-07-15 07:56:20.834443] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x202a980 00:20:36.325 [2024-07-15 07:56:20.834449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.325 [2024-07-15 07:56:20.835632] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:20:36.325 [2024-07-15 07:56:20.835651] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:36.325 BaseBdev4 00:20:36.325 07:56:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:36.325 [2024-07-15 07:56:21.010870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:36.325 [2024-07-15 07:56:21.011872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:36.325 [2024-07-15 07:56:21.011932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:36.325 [2024-07-15 07:56:21.011976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:36.325 [2024-07-15 07:56:21.012151] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x202a4e0 00:20:36.325 [2024-07-15 07:56:21.012158] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:36.325 [2024-07-15 07:56:21.012301] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e8c210 00:20:36.325 [2024-07-15 07:56:21.012421] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x202a4e0 00:20:36.325 [2024-07-15 07:56:21.012426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x202a4e0 00:20:36.325 [2024-07-15 07:56:21.012502] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.325 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.584 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.584 "name": "raid_bdev1", 00:20:36.584 "uuid": "43623d10-4017-44b5-b9e6-6d39fab6c278", 00:20:36.584 "strip_size_kb": 0, 00:20:36.584 "state": "online", 00:20:36.584 "raid_level": "raid1", 00:20:36.584 "superblock": true, 00:20:36.584 "num_base_bdevs": 4, 00:20:36.584 "num_base_bdevs_discovered": 4, 00:20:36.584 
"num_base_bdevs_operational": 4, 00:20:36.584 "base_bdevs_list": [ 00:20:36.584 { 00:20:36.584 "name": "BaseBdev1", 00:20:36.584 "uuid": "05e6c745-2b30-5abe-b7ad-8864724d757b", 00:20:36.584 "is_configured": true, 00:20:36.584 "data_offset": 2048, 00:20:36.584 "data_size": 63488 00:20:36.584 }, 00:20:36.584 { 00:20:36.584 "name": "BaseBdev2", 00:20:36.584 "uuid": "e831acbd-fee8-5603-8b85-21c78f4d7d30", 00:20:36.584 "is_configured": true, 00:20:36.584 "data_offset": 2048, 00:20:36.584 "data_size": 63488 00:20:36.584 }, 00:20:36.584 { 00:20:36.584 "name": "BaseBdev3", 00:20:36.584 "uuid": "bc910e8c-cca9-57a9-bef8-b705bb9d065e", 00:20:36.584 "is_configured": true, 00:20:36.584 "data_offset": 2048, 00:20:36.584 "data_size": 63488 00:20:36.584 }, 00:20:36.584 { 00:20:36.584 "name": "BaseBdev4", 00:20:36.584 "uuid": "850d3a3d-5968-5a9d-a3b4-97c9a485d348", 00:20:36.584 "is_configured": true, 00:20:36.584 "data_offset": 2048, 00:20:36.584 "data_size": 63488 00:20:36.584 } 00:20:36.584 ] 00:20:36.584 }' 00:20:36.584 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.584 07:56:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:37.151 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:37.151 07:56:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:37.151 [2024-07-15 07:56:21.861243] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x202fa50 00:20:38.088 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:38.347 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:38.348 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:38.348 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:38.348 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.348 07:56:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.609 07:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.609 "name": "raid_bdev1", 00:20:38.609 "uuid": "43623d10-4017-44b5-b9e6-6d39fab6c278", 00:20:38.609 "strip_size_kb": 0, 00:20:38.609 "state": "online", 00:20:38.609 "raid_level": "raid1", 00:20:38.609 "superblock": true, 00:20:38.609 "num_base_bdevs": 4, 00:20:38.609 "num_base_bdevs_discovered": 4, 00:20:38.609 "num_base_bdevs_operational": 4, 00:20:38.609 "base_bdevs_list": [ 00:20:38.609 { 00:20:38.609 "name": "BaseBdev1", 00:20:38.609 "uuid": "05e6c745-2b30-5abe-b7ad-8864724d757b", 00:20:38.609 "is_configured": true, 00:20:38.609 "data_offset": 2048, 00:20:38.609 "data_size": 63488 00:20:38.609 }, 00:20:38.609 { 00:20:38.609 "name": "BaseBdev2", 00:20:38.609 "uuid": "e831acbd-fee8-5603-8b85-21c78f4d7d30", 00:20:38.609 "is_configured": true, 00:20:38.609 "data_offset": 2048, 00:20:38.609 "data_size": 63488 00:20:38.609 }, 00:20:38.609 { 00:20:38.609 "name": "BaseBdev3", 00:20:38.609 "uuid": "bc910e8c-cca9-57a9-bef8-b705bb9d065e", 00:20:38.609 "is_configured": true, 00:20:38.609 "data_offset": 2048, 00:20:38.609 "data_size": 63488 00:20:38.609 }, 00:20:38.609 { 00:20:38.609 "name": "BaseBdev4", 00:20:38.609 "uuid": "850d3a3d-5968-5a9d-a3b4-97c9a485d348", 00:20:38.609 "is_configured": true, 00:20:38.609 "data_offset": 2048, 00:20:38.609 "data_size": 63488 00:20:38.609 } 00:20:38.609 ] 00:20:38.609 }' 00:20:38.609 07:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.609 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:39.177 [2024-07-15 07:56:23.859857] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:39.177 [2024-07-15 07:56:23.859884] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:39.177 [2024-07-15 07:56:23.862517] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:39.177 [2024-07-15 07:56:23.862550] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.177 [2024-07-15 07:56:23.862640] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:39.177 [2024-07-15 07:56:23.862646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x202a4e0 name raid_bdev1, state offline 00:20:39.177 0 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1701597 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1701597 ']' 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1701597 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:39.177 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1701597 00:20:39.437 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:20:39.437 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:39.437 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1701597' 00:20:39.437 killing process with pid 1701597 00:20:39.437 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1701597 00:20:39.437 [2024-07-15 07:56:23.940956] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:39.437 07:56:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1701597 00:20:39.437 [2024-07-15 07:56:23.958086] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.IRvkZzHbAr 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:39.437 00:20:39.437 real 0m6.374s 00:20:39.437 user 0m10.245s 00:20:39.437 sys 0m0.919s 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:39.437 07:56:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.437 ************************************ 00:20:39.437 END TEST raid_read_error_test 00:20:39.437 ************************************ 00:20:39.437 07:56:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:39.437 07:56:24 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:20:39.437 07:56:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:39.437 07:56:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:39.437 07:56:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:39.437 ************************************ 00:20:39.437 START TEST raid_write_error_test 00:20:39.437 ************************************ 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.VDO5KAIgmF 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1702767 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1702767 /var/tmp/spdk-raid.sock 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1702767 ']' 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:39.437 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:39.437 07:56:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.696 [2024-07-15 07:56:24.234671] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:20:39.696 [2024-07-15 07:56:24.234723] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1702767 ] 00:20:39.696 [2024-07-15 07:56:24.323034] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:39.697 [2024-07-15 07:56:24.387099] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:39.697 [2024-07-15 07:56:24.430833] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:39.697 [2024-07-15 07:56:24.430858] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:40.637 07:56:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:40.637 07:56:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:40.637 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:40.637 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:40.637 BaseBdev1_malloc 00:20:40.637 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:40.897 true 00:20:40.897 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:40.897 [2024-07-15 07:56:25.605498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:40.897 [2024-07-15 07:56:25.605529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:40.897 [2024-07-15 07:56:25.605539] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e7b50 00:20:40.897 [2024-07-15 07:56:25.605546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:40.897 [2024-07-15 07:56:25.606834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:40.897 [2024-07-15 07:56:25.606853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:40.897 BaseBdev1 00:20:40.897 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:40.897 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:41.157 BaseBdev2_malloc 00:20:41.157 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:41.416 true 00:20:41.416 07:56:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:41.416 [2024-07-15 07:56:26.164761] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:41.416 [2024-07-15 07:56:26.164789] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.416 [2024-07-15 07:56:26.164801] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cbea0 00:20:41.416 [2024-07-15 07:56:26.164807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.416 [2024-07-15 07:56:26.165987] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.416 [2024-07-15 07:56:26.166006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:41.416 BaseBdev2 00:20:41.676 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:41.676 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:41.676 BaseBdev3_malloc 00:20:41.676 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:41.936 true 00:20:41.936 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:42.196 [2024-07-15 07:56:26.732112] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:42.196 [2024-07-15 07:56:26.732141] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:42.196 [2024-07-15 07:56:26.732153] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cffb0 00:20:42.196 [2024-07-15 07:56:26.732160] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:42.196 [2024-07-15 07:56:26.733336] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:42.196 [2024-07-15 07:56:26.733355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:42.196 BaseBdev3 00:20:42.196 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:42.196 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:42.196 BaseBdev4_malloc 00:20:42.196 07:56:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:42.456 true 00:20:42.456 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:42.716 [2024-07-15 07:56:27.299463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:42.716 [2024-07-15 07:56:27.299492] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:20:42.716 [2024-07-15 07:56:27.299503] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d1980 00:20:42.716 [2024-07-15 07:56:27.299510] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:42.716 [2024-07-15 07:56:27.300698] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:42.716 [2024-07-15 07:56:27.300723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:42.716 BaseBdev4 00:20:42.716 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:42.976 [2024-07-15 07:56:27.487960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:42.976 [2024-07-15 07:56:27.488971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:42.976 [2024-07-15 07:56:27.489023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:42.976 [2024-07-15 07:56:27.489068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:42.976 [2024-07-15 07:56:27.489243] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23d14e0 00:20:42.976 [2024-07-15 07:56:27.489250] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:42.976 [2024-07-15 07:56:27.489397] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2233210 00:20:42.976 [2024-07-15 07:56:27.489516] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23d14e0 00:20:42.976 [2024-07-15 07:56:27.489522] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23d14e0 00:20:42.976 [2024-07-15 07:56:27.489595] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.976 "name": "raid_bdev1", 00:20:42.976 "uuid": "be50ffce-3923-4db7-8637-831f666e0638", 00:20:42.976 "strip_size_kb": 0, 00:20:42.976 "state": "online", 00:20:42.976 "raid_level": "raid1", 00:20:42.976 "superblock": true, 00:20:42.976 "num_base_bdevs": 4, 00:20:42.976 "num_base_bdevs_discovered": 4, 00:20:42.976 "num_base_bdevs_operational": 4, 00:20:42.976 "base_bdevs_list": [ 00:20:42.976 { 00:20:42.976 "name": "BaseBdev1", 00:20:42.976 "uuid": "ef1ba0c5-8920-5007-844a-13a6676ac059", 00:20:42.976 "is_configured": true, 00:20:42.976 "data_offset": 2048, 00:20:42.976 "data_size": 63488 00:20:42.976 }, 00:20:42.976 { 00:20:42.976 "name": "BaseBdev2", 00:20:42.976 "uuid": "3e1c3d98-c33d-595a-a88f-3f90b228c873", 00:20:42.976 "is_configured": true, 00:20:42.976 "data_offset": 2048, 00:20:42.976 "data_size": 63488 00:20:42.976 }, 00:20:42.976 { 00:20:42.976 "name": "BaseBdev3", 00:20:42.976 "uuid": "4e4aa2db-da89-5e24-8d49-21c56468771b", 00:20:42.976 "is_configured": true, 00:20:42.976 "data_offset": 2048, 00:20:42.976 "data_size": 63488 00:20:42.976 }, 00:20:42.976 { 00:20:42.976 "name": "BaseBdev4", 00:20:42.976 "uuid": "44316f8d-25ec-51ad-b155-61c0af27491d", 00:20:42.976 "is_configured": true, 00:20:42.976 "data_offset": 2048, 00:20:42.976 "data_size": 63488 00:20:42.976 } 00:20:42.976 ] 00:20:42.976 }' 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.976 07:56:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.547 07:56:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:43.547 07:56:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:43.547 [2024-07-15 07:56:28.282176] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d6a50 00:20:44.487 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:44.747 [2024-07-15 07:56:29.377312] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:20:44.747 [2024-07-15 07:56:29.377351] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:44.747 [2024-07-15 07:56:29.377541] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x23d6a50 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.747 
07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.747 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.007 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.007 "name": "raid_bdev1", 00:20:45.007 "uuid": "be50ffce-3923-4db7-8637-831f666e0638", 00:20:45.007 "strip_size_kb": 0, 00:20:45.007 "state": "online", 00:20:45.007 "raid_level": "raid1", 00:20:45.007 "superblock": true, 00:20:45.007 "num_base_bdevs": 4, 00:20:45.007 "num_base_bdevs_discovered": 3, 00:20:45.007 "num_base_bdevs_operational": 3, 00:20:45.007 "base_bdevs_list": [ 00:20:45.007 { 00:20:45.007 "name": null, 00:20:45.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.007 "is_configured": false, 00:20:45.007 "data_offset": 2048, 00:20:45.007 "data_size": 63488 00:20:45.007 }, 00:20:45.007 { 00:20:45.007 "name": "BaseBdev2", 00:20:45.007 "uuid": "3e1c3d98-c33d-595a-a88f-3f90b228c873", 00:20:45.007 "is_configured": true, 00:20:45.007 "data_offset": 2048, 00:20:45.007 "data_size": 63488 00:20:45.007 }, 00:20:45.007 { 00:20:45.007 "name": "BaseBdev3", 00:20:45.007 "uuid": "4e4aa2db-da89-5e24-8d49-21c56468771b", 00:20:45.007 "is_configured": true, 00:20:45.007 "data_offset": 2048, 00:20:45.007 "data_size": 63488 00:20:45.007 }, 00:20:45.007 { 00:20:45.007 "name": "BaseBdev4", 00:20:45.007 "uuid": "44316f8d-25ec-51ad-b155-61c0af27491d", 00:20:45.007 "is_configured": true, 00:20:45.007 "data_offset": 2048, 00:20:45.007 "data_size": 63488 00:20:45.007 } 00:20:45.007 ] 00:20:45.007 }' 00:20:45.007 07:56:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.007 07:56:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.578 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:45.578 [2024-07-15 07:56:30.322511] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:45.578 [2024-07-15 07:56:30.322538] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:45.578 [2024-07-15 07:56:30.325160] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:45.578 [2024-07-15 07:56:30.325188] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:45.578 [2024-07-15 07:56:30.325262] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:45.578 [2024-07-15 07:56:30.325268] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23d14e0 name raid_bdev1, state 
offline 00:20:45.578 0 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1702767 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1702767 ']' 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1702767 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1702767 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1702767' 00:20:45.843 killing process with pid 1702767 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1702767 00:20:45.843 [2024-07-15 07:56:30.404719] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1702767 00:20:45.843 [2024-07-15 07:56:30.421929] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.VDO5KAIgmF 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:20:45.843 00:20:45.843 real 0m6.386s 00:20:45.843 user 0m10.317s 00:20:45.843 sys 0m0.863s 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:45.843 07:56:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.843 ************************************ 00:20:45.843 END TEST raid_write_error_test 00:20:45.843 ************************************ 00:20:45.843 07:56:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:45.843 07:56:30 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:20:45.843 07:56:30 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:20:45.843 07:56:30 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:20:46.178 07:56:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:20:46.178 07:56:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:46.178 07:56:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:46.178 ************************************ 00:20:46.178 START TEST raid_rebuild_test 00:20:46.178 ************************************ 00:20:46.178 
07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1703910 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1703910 /var/tmp/spdk-raid.sock 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1703910 ']' 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:46.178 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
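For reference while reading the rebuild trace below: alongside two malloc+passthru base bdevs, the test builds a delayed "spare" bdev, assembles a raid1 without a superblock, and later swaps the spare in to drive a rebuild. A condensed sketch using only rpc.py calls that appear later in this log (grouping them into one snippet is illustrative):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # BaseBdev1 and BaseBdev2 are created as malloc + passthru pairs, as shown in the trace below
  $RPC bdev_malloc_create 32 512 -b spare_malloc
  $RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000   # 100 ms write latency on the spare
  $RPC bdev_passthru_create -b spare_delay -p spare
  $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1                 # no -s: superblock=false in this variant
  $RPC bdev_raid_remove_base_bdev BaseBdev1       # degrade the array ...
  $RPC bdev_raid_add_base_bdev raid_bdev1 spare   # ... then add the spare to start the rebuild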
00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:46.178 07:56:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:46.178 [2024-07-15 07:56:30.697582] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:20:46.178 [2024-07-15 07:56:30.697636] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1703910 ] 00:20:46.178 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:46.178 Zero copy mechanism will not be used. 00:20:46.178 [2024-07-15 07:56:30.792676] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:46.178 [2024-07-15 07:56:30.867635] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:46.178 [2024-07-15 07:56:30.907898] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:46.178 [2024-07-15 07:56:30.907922] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:47.116 07:56:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:47.116 07:56:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:20:47.116 07:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:47.116 07:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:47.116 BaseBdev1_malloc 00:20:47.116 07:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:47.376 [2024-07-15 07:56:31.895023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:47.376 [2024-07-15 07:56:31.895058] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.376 [2024-07-15 07:56:31.895070] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a1d30 00:20:47.376 [2024-07-15 07:56:31.895077] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.376 [2024-07-15 07:56:31.896329] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.376 [2024-07-15 07:56:31.896349] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:47.376 BaseBdev1 00:20:47.376 07:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:47.376 07:56:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:47.376 BaseBdev2_malloc 00:20:47.376 07:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:47.637 [2024-07-15 07:56:32.265768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:47.637 [2024-07-15 07:56:32.265792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.637 
[2024-07-15 07:56:32.265803] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2754c60 00:20:47.637 [2024-07-15 07:56:32.265810] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.637 [2024-07-15 07:56:32.266958] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.637 [2024-07-15 07:56:32.266977] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:47.637 BaseBdev2 00:20:47.637 07:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:47.897 spare_malloc 00:20:47.897 07:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:47.897 spare_delay 00:20:48.157 07:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:48.157 [2024-07-15 07:56:32.844937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:48.157 [2024-07-15 07:56:32.844968] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:48.157 [2024-07-15 07:56:32.844980] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2744ec0 00:20:48.157 [2024-07-15 07:56:32.844987] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:48.157 [2024-07-15 07:56:32.846159] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:48.157 [2024-07-15 07:56:32.846178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:48.157 spare 00:20:48.157 07:56:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:48.417 [2024-07-15 07:56:33.037435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:48.417 [2024-07-15 07:56:33.038446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:48.417 [2024-07-15 07:56:33.038500] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x273c390 00:20:48.417 [2024-07-15 07:56:33.038506] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:48.417 [2024-07-15 07:56:33.038648] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259c7f0 00:20:48.417 [2024-07-15 07:56:33.038765] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x273c390 00:20:48.417 [2024-07-15 07:56:33.038771] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x273c390 00:20:48.417 [2024-07-15 07:56:33.038848] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.417 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.984 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.984 "name": "raid_bdev1", 00:20:48.984 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:20:48.984 "strip_size_kb": 0, 00:20:48.984 "state": "online", 00:20:48.984 "raid_level": "raid1", 00:20:48.984 "superblock": false, 00:20:48.984 "num_base_bdevs": 2, 00:20:48.984 "num_base_bdevs_discovered": 2, 00:20:48.984 "num_base_bdevs_operational": 2, 00:20:48.984 "base_bdevs_list": [ 00:20:48.984 { 00:20:48.984 "name": "BaseBdev1", 00:20:48.984 "uuid": "1cffac56-ad0a-547d-8126-7ca7c441eb9c", 00:20:48.984 "is_configured": true, 00:20:48.984 "data_offset": 0, 00:20:48.984 "data_size": 65536 00:20:48.984 }, 00:20:48.984 { 00:20:48.984 "name": "BaseBdev2", 00:20:48.984 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:20:48.984 "is_configured": true, 00:20:48.984 "data_offset": 0, 00:20:48.984 "data_size": 65536 00:20:48.984 } 00:20:48.984 ] 00:20:48.984 }' 00:20:48.984 07:56:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.984 07:56:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.552 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:49.552 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:49.812 [2024-07-15 07:56:34.312869] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:49.812 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:50.072 [2024-07-15 07:56:34.705687] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x273b3c0 00:20:50.072 /dev/nbd0 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:50.072 1+0 records in 00:20:50.072 1+0 records out 00:20:50.072 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261148 s, 15.7 MB/s 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 
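The data-fill step that follows exposes raid_bdev1 through NBD and writes every block once before the base-bdev removal later in the trace. Condensed from the commands visible in this log (running them back to back like this is illustrative):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $RPC nbd_start_disk raid_bdev1 /dev/nbd0                           # export the raid1 bdev as /dev/nbd0
  dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct    # fill all 65536 blocks (32 MiB)
  $RPC nbd_stop_disk /dev/nbd0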
00:20:50.072 07:56:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:20:55.357 65536+0 records in 00:20:55.357 65536+0 records out 00:20:55.357 33554432 bytes (34 MB, 32 MiB) copied, 4.63171 s, 7.2 MB/s 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:55.357 [2024-07-15 07:56:39.592368] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:55.357 [2024-07-15 07:56:39.769053] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.357 "name": "raid_bdev1", 00:20:55.357 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:20:55.357 "strip_size_kb": 0, 00:20:55.357 "state": "online", 00:20:55.357 "raid_level": "raid1", 00:20:55.357 "superblock": false, 00:20:55.357 "num_base_bdevs": 2, 00:20:55.357 "num_base_bdevs_discovered": 1, 00:20:55.357 "num_base_bdevs_operational": 1, 00:20:55.357 "base_bdevs_list": [ 00:20:55.357 { 00:20:55.357 "name": null, 00:20:55.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.357 "is_configured": false, 00:20:55.357 "data_offset": 0, 00:20:55.357 "data_size": 65536 00:20:55.357 }, 00:20:55.357 { 00:20:55.357 "name": "BaseBdev2", 00:20:55.357 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:20:55.357 "is_configured": true, 00:20:55.357 "data_offset": 0, 00:20:55.357 "data_size": 65536 00:20:55.357 } 00:20:55.357 ] 00:20:55.357 }' 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.357 07:56:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:55.927 07:56:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:56.188 [2024-07-15 07:56:40.703421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:56.188 [2024-07-15 07:56:40.706843] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259c720 00:20:56.188 [2024-07-15 07:56:40.708402] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:56.188 07:56:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.129 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.389 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:57.389 "name": "raid_bdev1", 00:20:57.389 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:20:57.389 "strip_size_kb": 0, 00:20:57.389 "state": "online", 00:20:57.389 "raid_level": "raid1", 00:20:57.389 "superblock": false, 00:20:57.389 "num_base_bdevs": 2, 00:20:57.389 "num_base_bdevs_discovered": 2, 00:20:57.389 "num_base_bdevs_operational": 2, 00:20:57.389 "process": { 00:20:57.389 "type": "rebuild", 00:20:57.389 "target": "spare", 00:20:57.389 "progress": { 00:20:57.389 "blocks": 22528, 00:20:57.389 "percent": 34 00:20:57.389 } 00:20:57.389 }, 00:20:57.389 
"base_bdevs_list": [ 00:20:57.389 { 00:20:57.389 "name": "spare", 00:20:57.389 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:20:57.389 "is_configured": true, 00:20:57.389 "data_offset": 0, 00:20:57.389 "data_size": 65536 00:20:57.389 }, 00:20:57.389 { 00:20:57.389 "name": "BaseBdev2", 00:20:57.389 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:20:57.389 "is_configured": true, 00:20:57.389 "data_offset": 0, 00:20:57.390 "data_size": 65536 00:20:57.390 } 00:20:57.390 ] 00:20:57.390 }' 00:20:57.390 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:57.390 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:57.390 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:57.390 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:57.390 07:56:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:57.650 [2024-07-15 07:56:42.164567] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:57.650 [2024-07-15 07:56:42.217319] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:57.650 [2024-07-15 07:56:42.217349] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:57.650 [2024-07-15 07:56:42.217359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:57.650 [2024-07-15 07:56:42.217363] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.650 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.910 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.910 "name": "raid_bdev1", 00:20:57.910 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:20:57.910 "strip_size_kb": 0, 00:20:57.910 "state": "online", 00:20:57.910 "raid_level": "raid1", 00:20:57.910 "superblock": false, 00:20:57.910 "num_base_bdevs": 2, 00:20:57.910 
"num_base_bdevs_discovered": 1, 00:20:57.910 "num_base_bdevs_operational": 1, 00:20:57.910 "base_bdevs_list": [ 00:20:57.910 { 00:20:57.910 "name": null, 00:20:57.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.910 "is_configured": false, 00:20:57.910 "data_offset": 0, 00:20:57.910 "data_size": 65536 00:20:57.910 }, 00:20:57.910 { 00:20:57.910 "name": "BaseBdev2", 00:20:57.910 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:20:57.910 "is_configured": true, 00:20:57.910 "data_offset": 0, 00:20:57.910 "data_size": 65536 00:20:57.910 } 00:20:57.910 ] 00:20:57.910 }' 00:20:57.910 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.910 07:56:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.480 07:56:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.480 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.480 "name": "raid_bdev1", 00:20:58.480 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:20:58.480 "strip_size_kb": 0, 00:20:58.480 "state": "online", 00:20:58.480 "raid_level": "raid1", 00:20:58.480 "superblock": false, 00:20:58.480 "num_base_bdevs": 2, 00:20:58.480 "num_base_bdevs_discovered": 1, 00:20:58.480 "num_base_bdevs_operational": 1, 00:20:58.480 "base_bdevs_list": [ 00:20:58.480 { 00:20:58.480 "name": null, 00:20:58.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.480 "is_configured": false, 00:20:58.480 "data_offset": 0, 00:20:58.480 "data_size": 65536 00:20:58.480 }, 00:20:58.480 { 00:20:58.480 "name": "BaseBdev2", 00:20:58.480 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:20:58.480 "is_configured": true, 00:20:58.480 "data_offset": 0, 00:20:58.480 "data_size": 65536 00:20:58.480 } 00:20:58.480 ] 00:20:58.480 }' 00:20:58.480 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.480 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:58.480 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:58.740 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:58.740 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:58.740 [2024-07-15 07:56:43.420282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:58.740 [2024-07-15 07:56:43.423681] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2599c70 00:20:58.740 [2024-07-15 07:56:43.424821] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:58.740 07:56:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:00.120 "name": "raid_bdev1", 00:21:00.120 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:21:00.120 "strip_size_kb": 0, 00:21:00.120 "state": "online", 00:21:00.120 "raid_level": "raid1", 00:21:00.120 "superblock": false, 00:21:00.120 "num_base_bdevs": 2, 00:21:00.120 "num_base_bdevs_discovered": 2, 00:21:00.120 "num_base_bdevs_operational": 2, 00:21:00.120 "process": { 00:21:00.120 "type": "rebuild", 00:21:00.120 "target": "spare", 00:21:00.120 "progress": { 00:21:00.120 "blocks": 22528, 00:21:00.120 "percent": 34 00:21:00.120 } 00:21:00.120 }, 00:21:00.120 "base_bdevs_list": [ 00:21:00.120 { 00:21:00.120 "name": "spare", 00:21:00.120 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:21:00.120 "is_configured": true, 00:21:00.120 "data_offset": 0, 00:21:00.120 "data_size": 65536 00:21:00.120 }, 00:21:00.120 { 00:21:00.120 "name": "BaseBdev2", 00:21:00.120 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:21:00.120 "is_configured": true, 00:21:00.120 "data_offset": 0, 00:21:00.120 "data_size": 65536 00:21:00.120 } 00:21:00.120 ] 00:21:00.120 }' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=670 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:00.120 07:56:44 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.120 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.379 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:00.379 "name": "raid_bdev1", 00:21:00.379 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:21:00.379 "strip_size_kb": 0, 00:21:00.379 "state": "online", 00:21:00.379 "raid_level": "raid1", 00:21:00.379 "superblock": false, 00:21:00.379 "num_base_bdevs": 2, 00:21:00.379 "num_base_bdevs_discovered": 2, 00:21:00.379 "num_base_bdevs_operational": 2, 00:21:00.379 "process": { 00:21:00.379 "type": "rebuild", 00:21:00.379 "target": "spare", 00:21:00.379 "progress": { 00:21:00.379 "blocks": 28672, 00:21:00.379 "percent": 43 00:21:00.379 } 00:21:00.379 }, 00:21:00.379 "base_bdevs_list": [ 00:21:00.379 { 00:21:00.379 "name": "spare", 00:21:00.379 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:21:00.379 "is_configured": true, 00:21:00.379 "data_offset": 0, 00:21:00.379 "data_size": 65536 00:21:00.379 }, 00:21:00.379 { 00:21:00.379 "name": "BaseBdev2", 00:21:00.379 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:21:00.379 "is_configured": true, 00:21:00.379 "data_offset": 0, 00:21:00.379 "data_size": 65536 00:21:00.379 } 00:21:00.379 ] 00:21:00.379 }' 00:21:00.379 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:00.379 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:00.379 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:00.379 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:00.379 07:56:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:01.317 07:56:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:01.317 07:56:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:01.317 07:56:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:01.317 07:56:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:01.317 07:56:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:01.317 07:56:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:01.317 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.317 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.577 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:01.577 "name": "raid_bdev1", 00:21:01.577 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:21:01.577 "strip_size_kb": 0, 00:21:01.577 "state": "online", 00:21:01.577 
"raid_level": "raid1", 00:21:01.577 "superblock": false, 00:21:01.577 "num_base_bdevs": 2, 00:21:01.577 "num_base_bdevs_discovered": 2, 00:21:01.577 "num_base_bdevs_operational": 2, 00:21:01.577 "process": { 00:21:01.577 "type": "rebuild", 00:21:01.577 "target": "spare", 00:21:01.577 "progress": { 00:21:01.577 "blocks": 55296, 00:21:01.577 "percent": 84 00:21:01.577 } 00:21:01.577 }, 00:21:01.577 "base_bdevs_list": [ 00:21:01.577 { 00:21:01.577 "name": "spare", 00:21:01.577 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:21:01.577 "is_configured": true, 00:21:01.577 "data_offset": 0, 00:21:01.577 "data_size": 65536 00:21:01.577 }, 00:21:01.577 { 00:21:01.577 "name": "BaseBdev2", 00:21:01.577 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:21:01.577 "is_configured": true, 00:21:01.577 "data_offset": 0, 00:21:01.577 "data_size": 65536 00:21:01.577 } 00:21:01.577 ] 00:21:01.577 }' 00:21:01.577 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:01.577 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:01.577 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:01.577 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:01.577 07:56:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:02.144 [2024-07-15 07:56:46.643545] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:02.144 [2024-07-15 07:56:46.643591] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:02.144 [2024-07-15 07:56:46.643617] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.713 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:02.713 "name": "raid_bdev1", 00:21:02.713 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:21:02.713 "strip_size_kb": 0, 00:21:02.713 "state": "online", 00:21:02.713 "raid_level": "raid1", 00:21:02.713 "superblock": false, 00:21:02.713 "num_base_bdevs": 2, 00:21:02.713 "num_base_bdevs_discovered": 2, 00:21:02.713 "num_base_bdevs_operational": 2, 00:21:02.713 "base_bdevs_list": [ 00:21:02.713 { 00:21:02.713 "name": "spare", 00:21:02.713 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:21:02.713 "is_configured": true, 00:21:02.713 "data_offset": 0, 00:21:02.713 "data_size": 65536 00:21:02.713 }, 00:21:02.713 { 00:21:02.713 "name": "BaseBdev2", 00:21:02.713 
"uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:21:02.713 "is_configured": true, 00:21:02.713 "data_offset": 0, 00:21:02.713 "data_size": 65536 00:21:02.713 } 00:21:02.713 ] 00:21:02.713 }' 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.973 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:03.233 "name": "raid_bdev1", 00:21:03.233 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:21:03.233 "strip_size_kb": 0, 00:21:03.233 "state": "online", 00:21:03.233 "raid_level": "raid1", 00:21:03.233 "superblock": false, 00:21:03.233 "num_base_bdevs": 2, 00:21:03.233 "num_base_bdevs_discovered": 2, 00:21:03.233 "num_base_bdevs_operational": 2, 00:21:03.233 "base_bdevs_list": [ 00:21:03.233 { 00:21:03.233 "name": "spare", 00:21:03.233 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:21:03.233 "is_configured": true, 00:21:03.233 "data_offset": 0, 00:21:03.233 "data_size": 65536 00:21:03.233 }, 00:21:03.233 { 00:21:03.233 "name": "BaseBdev2", 00:21:03.233 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:21:03.233 "is_configured": true, 00:21:03.233 "data_offset": 0, 00:21:03.233 "data_size": 65536 00:21:03.233 } 00:21:03.233 ] 00:21:03.233 }' 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:03.233 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.234 07:56:47 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.234 07:56:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.493 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.494 "name": "raid_bdev1", 00:21:03.494 "uuid": "6822656e-d4ec-4511-95a4-dd5b0ce57935", 00:21:03.494 "strip_size_kb": 0, 00:21:03.494 "state": "online", 00:21:03.494 "raid_level": "raid1", 00:21:03.494 "superblock": false, 00:21:03.494 "num_base_bdevs": 2, 00:21:03.494 "num_base_bdevs_discovered": 2, 00:21:03.494 "num_base_bdevs_operational": 2, 00:21:03.494 "base_bdevs_list": [ 00:21:03.494 { 00:21:03.494 "name": "spare", 00:21:03.494 "uuid": "8a421be5-4ca1-53aa-9de7-2cd5e0b93690", 00:21:03.494 "is_configured": true, 00:21:03.494 "data_offset": 0, 00:21:03.494 "data_size": 65536 00:21:03.494 }, 00:21:03.494 { 00:21:03.494 "name": "BaseBdev2", 00:21:03.494 "uuid": "173545c1-2110-5b7c-a356-7d9d28da2c81", 00:21:03.494 "is_configured": true, 00:21:03.494 "data_offset": 0, 00:21:03.494 "data_size": 65536 00:21:03.494 } 00:21:03.494 ] 00:21:03.494 }' 00:21:03.494 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.494 07:56:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.095 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:04.095 [2024-07-15 07:56:48.784162] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:04.095 [2024-07-15 07:56:48.784181] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:04.095 [2024-07-15 07:56:48.784228] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:04.095 [2024-07-15 07:56:48.784268] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:04.095 [2024-07-15 07:56:48.784274] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x273c390 name raid_bdev1, state offline 00:21:04.095 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.095 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:04.355 07:56:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:04.616 /dev/nbd0 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:04.616 1+0 records in 00:21:04.616 1+0 records out 00:21:04.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189002 s, 21.7 MB/s 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:04.616 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:04.876 /dev/nbd1 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:04.876 1+0 records in 00:21:04.876 1+0 records out 00:21:04.876 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274071 s, 14.9 MB/s 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:04.876 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:05.137 07:56:49 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:05.137 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1703910 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1703910 ']' 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1703910 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1703910 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1703910' 00:21:05.397 killing process with pid 1703910 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1703910 00:21:05.397 Received shutdown signal, test time was about 60.000000 seconds 00:21:05.397 00:21:05.397 Latency(us) 00:21:05.397 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:05.397 =================================================================================================================== 00:21:05.397 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:05.397 [2024-07-15 07:56:50.000684] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:05.397 07:56:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1703910 00:21:05.397 [2024-07-15 07:56:50.017171] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:05.397 07:56:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:21:05.397 00:21:05.397 real 0m19.504s 00:21:05.397 user 0m27.014s 00:21:05.397 sys 0m3.751s 00:21:05.397 07:56:50 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:21:05.397 07:56:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.397 ************************************ 00:21:05.397 END TEST raid_rebuild_test 00:21:05.397 ************************************ 00:21:05.658 07:56:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:05.658 07:56:50 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:21:05.658 07:56:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:05.658 07:56:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:05.658 07:56:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:05.658 ************************************ 00:21:05.658 START TEST raid_rebuild_test_sb 00:21:05.658 ************************************ 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:05.658 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:05.658 07:56:50 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1707389 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1707389 /var/tmp/spdk-raid.sock 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1707389 ']' 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:05.659 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:05.659 07:56:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:05.659 [2024-07-15 07:56:50.276234] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:21:05.659 [2024-07-15 07:56:50.276283] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707389 ] 00:21:05.659 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:05.659 Zero copy mechanism will not be used. 
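The xtrace output above and below repeatedly exercises one verification pattern: query the RAID bdev over the dedicated RPC socket, filter the JSON with jq, and compare the reported background process type/target until the rebuild disappears. A minimal sketch of that pattern, reconstructed from the traced commands only (the real helpers live in test/bdev/bdev_raid.sh and common shell libs; names, the 60-second budget, and exact structure here are inferred and may differ from the actual script):

    # Hypothetical reconstruction inferred from the xtrace lines, not copied from bdev_raid.sh
    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    verify_raid_bdev_process() {
        local name=$1 expected_type=$2 expected_target=$3
        local info ptype ptarget
        # Fetch all RAID bdevs and keep only the one under test
        info=$($rpc_py bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
        # ".process" is absent once the rebuild finishes, so default both fields to "none"
        ptype=$(jq -r '.process.type // "none"' <<< "$info")
        ptarget=$(jq -r '.process.target // "none"' <<< "$info")
        [[ $ptype == "$expected_type" && $ptarget == "$expected_target" ]]
    }

    # Poll while the bdev still reports a rebuild targeting the spare, then check final state
    timeout=$((SECONDS + 60))
    while (( SECONDS < timeout )); do
        verify_raid_bdev_process raid_bdev1 rebuild spare || break
        sleep 1
    done
    verify_raid_bdev_process raid_bdev1 none none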
00:21:05.659 [2024-07-15 07:56:50.364436] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:05.918 [2024-07-15 07:56:50.429844] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:05.918 [2024-07-15 07:56:50.478001] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.918 [2024-07-15 07:56:50.478026] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:06.486 07:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:06.486 07:56:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:06.486 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:06.486 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:06.745 BaseBdev1_malloc 00:21:06.745 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:06.745 [2024-07-15 07:56:51.436282] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:06.745 [2024-07-15 07:56:51.436319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.745 [2024-07-15 07:56:51.436332] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x199ed30 00:21:06.745 [2024-07-15 07:56:51.436339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.745 [2024-07-15 07:56:51.437717] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.745 [2024-07-15 07:56:51.437740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:06.745 BaseBdev1 00:21:06.745 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:06.746 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:07.005 BaseBdev2_malloc 00:21:07.005 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:07.265 [2024-07-15 07:56:51.818931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:07.265 [2024-07-15 07:56:51.818956] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.265 [2024-07-15 07:56:51.818966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b51c60 00:21:07.265 [2024-07-15 07:56:51.818973] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.265 [2024-07-15 07:56:51.820127] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.265 [2024-07-15 07:56:51.820145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:07.265 BaseBdev2 00:21:07.265 07:56:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:21:07.265 spare_malloc 00:21:07.525 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:07.525 spare_delay 00:21:07.525 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:07.785 [2024-07-15 07:56:52.369884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:07.785 [2024-07-15 07:56:52.369908] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.785 [2024-07-15 07:56:52.369918] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b41ec0 00:21:07.785 [2024-07-15 07:56:52.369925] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.785 [2024-07-15 07:56:52.371068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.785 [2024-07-15 07:56:52.371085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:07.785 spare 00:21:07.785 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:08.045 [2024-07-15 07:56:52.550361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:08.045 [2024-07-15 07:56:52.551330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.045 [2024-07-15 07:56:52.551443] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b39390 00:21:08.045 [2024-07-15 07:56:52.551451] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:08.045 [2024-07-15 07:56:52.551588] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19957c0 00:21:08.045 [2024-07-15 07:56:52.551696] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b39390 00:21:08.045 [2024-07-15 07:56:52.551705] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b39390 00:21:08.045 [2024-07-15 07:56:52.551778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.045 07:56:52 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.045 "name": "raid_bdev1", 00:21:08.045 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:08.045 "strip_size_kb": 0, 00:21:08.045 "state": "online", 00:21:08.045 "raid_level": "raid1", 00:21:08.045 "superblock": true, 00:21:08.045 "num_base_bdevs": 2, 00:21:08.045 "num_base_bdevs_discovered": 2, 00:21:08.045 "num_base_bdevs_operational": 2, 00:21:08.045 "base_bdevs_list": [ 00:21:08.045 { 00:21:08.045 "name": "BaseBdev1", 00:21:08.045 "uuid": "4eea1d0f-f53e-59b9-a55d-b367738f699f", 00:21:08.045 "is_configured": true, 00:21:08.045 "data_offset": 2048, 00:21:08.045 "data_size": 63488 00:21:08.045 }, 00:21:08.045 { 00:21:08.045 "name": "BaseBdev2", 00:21:08.045 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:08.045 "is_configured": true, 00:21:08.045 "data_offset": 2048, 00:21:08.045 "data_size": 63488 00:21:08.045 } 00:21:08.045 ] 00:21:08.045 }' 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.045 07:56:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:08.615 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:08.615 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:08.873 [2024-07-15 07:56:53.472882] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:08.873 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:08.873 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.874 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:09.132 07:56:53 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:09.132 [2024-07-15 07:56:53.861676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b37b00 00:21:09.132 /dev/nbd0 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:09.132 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:09.392 1+0 records in 00:21:09.392 1+0 records out 00:21:09.392 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287018 s, 14.3 MB/s 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:09.392 07:56:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:21:15.979 63488+0 records in 00:21:15.979 63488+0 records out 00:21:15.979 32505856 bytes (33 MB, 31 MiB) copied, 6.51441 s, 5.0 MB/s 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:15.979 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:15.980 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:15.980 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:15.980 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:15.980 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:15.980 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:15.980 [2024-07-15 07:57:00.630444] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:16.240 [2024-07-15 07:57:00.790880] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.240 "name": "raid_bdev1", 00:21:16.240 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:16.240 "strip_size_kb": 0, 00:21:16.240 "state": "online", 
00:21:16.240 "raid_level": "raid1", 00:21:16.240 "superblock": true, 00:21:16.240 "num_base_bdevs": 2, 00:21:16.240 "num_base_bdevs_discovered": 1, 00:21:16.240 "num_base_bdevs_operational": 1, 00:21:16.240 "base_bdevs_list": [ 00:21:16.240 { 00:21:16.240 "name": null, 00:21:16.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.240 "is_configured": false, 00:21:16.240 "data_offset": 2048, 00:21:16.240 "data_size": 63488 00:21:16.240 }, 00:21:16.240 { 00:21:16.240 "name": "BaseBdev2", 00:21:16.240 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:16.240 "is_configured": true, 00:21:16.240 "data_offset": 2048, 00:21:16.240 "data_size": 63488 00:21:16.240 } 00:21:16.240 ] 00:21:16.240 }' 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.240 07:57:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:16.813 07:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:17.073 [2024-07-15 07:57:01.673110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:17.073 [2024-07-15 07:57:01.676582] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b37aa0 00:21:17.073 [2024-07-15 07:57:01.678158] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:17.073 07:57:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.015 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.276 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:18.276 "name": "raid_bdev1", 00:21:18.276 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:18.276 "strip_size_kb": 0, 00:21:18.276 "state": "online", 00:21:18.276 "raid_level": "raid1", 00:21:18.276 "superblock": true, 00:21:18.276 "num_base_bdevs": 2, 00:21:18.276 "num_base_bdevs_discovered": 2, 00:21:18.276 "num_base_bdevs_operational": 2, 00:21:18.276 "process": { 00:21:18.276 "type": "rebuild", 00:21:18.276 "target": "spare", 00:21:18.276 "progress": { 00:21:18.276 "blocks": 22528, 00:21:18.276 "percent": 35 00:21:18.276 } 00:21:18.276 }, 00:21:18.276 "base_bdevs_list": [ 00:21:18.276 { 00:21:18.276 "name": "spare", 00:21:18.276 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:18.276 "is_configured": true, 00:21:18.276 "data_offset": 2048, 00:21:18.276 "data_size": 63488 00:21:18.276 }, 00:21:18.276 { 00:21:18.276 "name": "BaseBdev2", 00:21:18.276 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:18.276 "is_configured": true, 00:21:18.276 
"data_offset": 2048, 00:21:18.276 "data_size": 63488 00:21:18.276 } 00:21:18.276 ] 00:21:18.276 }' 00:21:18.276 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:18.277 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.277 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:18.277 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.277 07:57:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:18.537 [2024-07-15 07:57:03.134360] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:18.537 [2024-07-15 07:57:03.187066] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:18.537 [2024-07-15 07:57:03.187098] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.537 [2024-07-15 07:57:03.187108] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:18.537 [2024-07-15 07:57:03.187113] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.537 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.538 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.538 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.538 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.538 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.797 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.797 "name": "raid_bdev1", 00:21:18.797 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:18.797 "strip_size_kb": 0, 00:21:18.797 "state": "online", 00:21:18.797 "raid_level": "raid1", 00:21:18.797 "superblock": true, 00:21:18.797 "num_base_bdevs": 2, 00:21:18.797 "num_base_bdevs_discovered": 1, 00:21:18.797 "num_base_bdevs_operational": 1, 00:21:18.797 "base_bdevs_list": [ 00:21:18.797 { 00:21:18.797 "name": null, 00:21:18.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.797 "is_configured": false, 00:21:18.797 "data_offset": 2048, 00:21:18.797 "data_size": 63488 00:21:18.797 }, 00:21:18.797 { 
00:21:18.797 "name": "BaseBdev2", 00:21:18.797 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:18.797 "is_configured": true, 00:21:18.797 "data_offset": 2048, 00:21:18.797 "data_size": 63488 00:21:18.797 } 00:21:18.797 ] 00:21:18.797 }' 00:21:18.797 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.797 07:57:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.368 07:57:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.628 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:19.628 "name": "raid_bdev1", 00:21:19.628 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:19.628 "strip_size_kb": 0, 00:21:19.628 "state": "online", 00:21:19.628 "raid_level": "raid1", 00:21:19.628 "superblock": true, 00:21:19.628 "num_base_bdevs": 2, 00:21:19.628 "num_base_bdevs_discovered": 1, 00:21:19.628 "num_base_bdevs_operational": 1, 00:21:19.628 "base_bdevs_list": [ 00:21:19.628 { 00:21:19.628 "name": null, 00:21:19.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.628 "is_configured": false, 00:21:19.628 "data_offset": 2048, 00:21:19.628 "data_size": 63488 00:21:19.628 }, 00:21:19.628 { 00:21:19.628 "name": "BaseBdev2", 00:21:19.628 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:19.628 "is_configured": true, 00:21:19.628 "data_offset": 2048, 00:21:19.628 "data_size": 63488 00:21:19.628 } 00:21:19.628 ] 00:21:19.628 }' 00:21:19.628 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:19.628 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:19.628 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:19.628 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:19.628 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:19.888 [2024-07-15 07:57:04.406027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:19.888 [2024-07-15 07:57:04.409419] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b379e0 00:21:19.888 [2024-07-15 07:57:04.410552] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:19.888 07:57:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.828 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.089 "name": "raid_bdev1", 00:21:21.089 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:21.089 "strip_size_kb": 0, 00:21:21.089 "state": "online", 00:21:21.089 "raid_level": "raid1", 00:21:21.089 "superblock": true, 00:21:21.089 "num_base_bdevs": 2, 00:21:21.089 "num_base_bdevs_discovered": 2, 00:21:21.089 "num_base_bdevs_operational": 2, 00:21:21.089 "process": { 00:21:21.089 "type": "rebuild", 00:21:21.089 "target": "spare", 00:21:21.089 "progress": { 00:21:21.089 "blocks": 22528, 00:21:21.089 "percent": 35 00:21:21.089 } 00:21:21.089 }, 00:21:21.089 "base_bdevs_list": [ 00:21:21.089 { 00:21:21.089 "name": "spare", 00:21:21.089 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:21.089 "is_configured": true, 00:21:21.089 "data_offset": 2048, 00:21:21.089 "data_size": 63488 00:21:21.089 }, 00:21:21.089 { 00:21:21.089 "name": "BaseBdev2", 00:21:21.089 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:21.089 "is_configured": true, 00:21:21.089 "data_offset": 2048, 00:21:21.089 "data_size": 63488 00:21:21.089 } 00:21:21.089 ] 00:21:21.089 }' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:21.089 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=691 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.089 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.349 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.349 "name": "raid_bdev1", 00:21:21.349 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:21.349 "strip_size_kb": 0, 00:21:21.349 "state": "online", 00:21:21.349 "raid_level": "raid1", 00:21:21.349 "superblock": true, 00:21:21.349 "num_base_bdevs": 2, 00:21:21.349 "num_base_bdevs_discovered": 2, 00:21:21.349 "num_base_bdevs_operational": 2, 00:21:21.349 "process": { 00:21:21.349 "type": "rebuild", 00:21:21.349 "target": "spare", 00:21:21.349 "progress": { 00:21:21.349 "blocks": 28672, 00:21:21.349 "percent": 45 00:21:21.349 } 00:21:21.349 }, 00:21:21.349 "base_bdevs_list": [ 00:21:21.349 { 00:21:21.349 "name": "spare", 00:21:21.349 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:21.349 "is_configured": true, 00:21:21.349 "data_offset": 2048, 00:21:21.349 "data_size": 63488 00:21:21.349 }, 00:21:21.349 { 00:21:21.349 "name": "BaseBdev2", 00:21:21.349 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:21.349 "is_configured": true, 00:21:21.349 "data_offset": 2048, 00:21:21.349 "data_size": 63488 00:21:21.349 } 00:21:21.349 ] 00:21:21.349 }' 00:21:21.349 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:21.349 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:21.349 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:21.349 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:21.349 07:57:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:22.289 07:57:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:22.289 07:57:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:22.289 07:57:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:22.289 07:57:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:22.289 07:57:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:22.289 07:57:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:22.289 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.289 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.554 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:22.554 "name": "raid_bdev1", 00:21:22.554 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:22.554 "strip_size_kb": 0, 00:21:22.554 "state": 
"online", 00:21:22.554 "raid_level": "raid1", 00:21:22.554 "superblock": true, 00:21:22.554 "num_base_bdevs": 2, 00:21:22.554 "num_base_bdevs_discovered": 2, 00:21:22.554 "num_base_bdevs_operational": 2, 00:21:22.554 "process": { 00:21:22.554 "type": "rebuild", 00:21:22.554 "target": "spare", 00:21:22.554 "progress": { 00:21:22.554 "blocks": 55296, 00:21:22.554 "percent": 87 00:21:22.554 } 00:21:22.554 }, 00:21:22.554 "base_bdevs_list": [ 00:21:22.554 { 00:21:22.554 "name": "spare", 00:21:22.554 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:22.554 "is_configured": true, 00:21:22.554 "data_offset": 2048, 00:21:22.554 "data_size": 63488 00:21:22.554 }, 00:21:22.554 { 00:21:22.554 "name": "BaseBdev2", 00:21:22.554 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:22.554 "is_configured": true, 00:21:22.554 "data_offset": 2048, 00:21:22.554 "data_size": 63488 00:21:22.554 } 00:21:22.554 ] 00:21:22.554 }' 00:21:22.554 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:22.554 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:22.554 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:22.554 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:22.554 07:57:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:22.856 [2024-07-15 07:57:07.528606] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:22.856 [2024-07-15 07:57:07.528651] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:22.856 [2024-07-15 07:57:07.528717] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:23.797 "name": "raid_bdev1", 00:21:23.797 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:23.797 "strip_size_kb": 0, 00:21:23.797 "state": "online", 00:21:23.797 "raid_level": "raid1", 00:21:23.797 "superblock": true, 00:21:23.797 "num_base_bdevs": 2, 00:21:23.797 "num_base_bdevs_discovered": 2, 00:21:23.797 "num_base_bdevs_operational": 2, 00:21:23.797 "base_bdevs_list": [ 00:21:23.797 { 00:21:23.797 "name": "spare", 00:21:23.797 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:23.797 "is_configured": true, 00:21:23.797 "data_offset": 2048, 00:21:23.797 "data_size": 63488 
00:21:23.797 }, 00:21:23.797 { 00:21:23.797 "name": "BaseBdev2", 00:21:23.797 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:23.797 "is_configured": true, 00:21:23.797 "data_offset": 2048, 00:21:23.797 "data_size": 63488 00:21:23.797 } 00:21:23.797 ] 00:21:23.797 }' 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:23.797 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:24.058 "name": "raid_bdev1", 00:21:24.058 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:24.058 "strip_size_kb": 0, 00:21:24.058 "state": "online", 00:21:24.058 "raid_level": "raid1", 00:21:24.058 "superblock": true, 00:21:24.058 "num_base_bdevs": 2, 00:21:24.058 "num_base_bdevs_discovered": 2, 00:21:24.058 "num_base_bdevs_operational": 2, 00:21:24.058 "base_bdevs_list": [ 00:21:24.058 { 00:21:24.058 "name": "spare", 00:21:24.058 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:24.058 "is_configured": true, 00:21:24.058 "data_offset": 2048, 00:21:24.058 "data_size": 63488 00:21:24.058 }, 00:21:24.058 { 00:21:24.058 "name": "BaseBdev2", 00:21:24.058 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:24.058 "is_configured": true, 00:21:24.058 "data_offset": 2048, 00:21:24.058 "data_size": 63488 00:21:24.058 } 00:21:24.058 ] 00:21:24.058 }' 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:24.058 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.319 07:57:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.319 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:24.319 "name": "raid_bdev1", 00:21:24.319 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:24.319 "strip_size_kb": 0, 00:21:24.319 "state": "online", 00:21:24.319 "raid_level": "raid1", 00:21:24.319 "superblock": true, 00:21:24.319 "num_base_bdevs": 2, 00:21:24.319 "num_base_bdevs_discovered": 2, 00:21:24.319 "num_base_bdevs_operational": 2, 00:21:24.319 "base_bdevs_list": [ 00:21:24.319 { 00:21:24.319 "name": "spare", 00:21:24.319 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:24.319 "is_configured": true, 00:21:24.319 "data_offset": 2048, 00:21:24.319 "data_size": 63488 00:21:24.319 }, 00:21:24.319 { 00:21:24.319 "name": "BaseBdev2", 00:21:24.319 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:24.319 "is_configured": true, 00:21:24.319 "data_offset": 2048, 00:21:24.319 "data_size": 63488 00:21:24.319 } 00:21:24.319 ] 00:21:24.319 }' 00:21:24.319 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:24.319 07:57:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:24.889 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:25.149 [2024-07-15 07:57:09.774136] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:25.149 [2024-07-15 07:57:09.774156] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:25.149 [2024-07-15 07:57:09.774199] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:25.149 [2024-07-15 07:57:09.774238] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:25.149 [2024-07-15 07:57:09.774244] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b39390 name raid_bdev1, state offline 00:21:25.149 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.149 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:25.409 07:57:09 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:25.409 07:57:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:25.668 /dev/nbd0 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:25.668 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:25.669 1+0 records in 00:21:25.669 1+0 records out 00:21:25.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023524 s, 17.4 MB/s 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:25.669 07:57:10 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:25.669 /dev/nbd1 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:25.669 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:25.929 1+0 records in 00:21:25.929 1+0 records out 00:21:25.929 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234346 s, 17.5 MB/s 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:25.929 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:26.189 07:57:10 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:26.189 07:57:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:26.450 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:26.710 [2024-07-15 07:57:11.267836] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:26.710 [2024-07-15 07:57:11.267869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.710 [2024-07-15 07:57:11.267881] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b37d60 00:21:26.710 [2024-07-15 07:57:11.267888] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.710 [2024-07-15 07:57:11.269276] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.710 [2024-07-15 07:57:11.269298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:26.710 [2024-07-15 07:57:11.269362] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:26.710 [2024-07-15 07:57:11.269383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:26.710 [2024-07-15 07:57:11.269464] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:26.710 spare 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.710 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.710 [2024-07-15 07:57:11.369754] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1995920 00:21:26.710 [2024-07-15 07:57:11.369767] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:26.710 [2024-07-15 07:57:11.369930] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b42150 00:21:26.710 [2024-07-15 07:57:11.370046] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1995920 00:21:26.710 [2024-07-15 07:57:11.370052] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1995920 00:21:26.710 [2024-07-15 07:57:11.370129] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:26.971 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.971 "name": "raid_bdev1", 00:21:26.971 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:26.971 "strip_size_kb": 0, 00:21:26.971 "state": "online", 00:21:26.971 "raid_level": "raid1", 00:21:26.971 "superblock": true, 00:21:26.971 "num_base_bdevs": 2, 00:21:26.971 "num_base_bdevs_discovered": 2, 00:21:26.971 "num_base_bdevs_operational": 2, 00:21:26.971 "base_bdevs_list": [ 00:21:26.971 { 00:21:26.971 "name": "spare", 00:21:26.971 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:26.971 "is_configured": true, 00:21:26.971 "data_offset": 2048, 00:21:26.971 "data_size": 63488 00:21:26.971 }, 00:21:26.971 { 00:21:26.971 "name": "BaseBdev2", 00:21:26.971 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:26.971 "is_configured": true, 00:21:26.971 "data_offset": 2048, 00:21:26.971 "data_size": 63488 00:21:26.971 } 00:21:26.971 ] 00:21:26.971 }' 00:21:26.971 07:57:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.971 07:57:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:27.542 "name": "raid_bdev1", 00:21:27.542 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:27.542 "strip_size_kb": 0, 00:21:27.542 "state": "online", 00:21:27.542 "raid_level": "raid1", 00:21:27.542 "superblock": true, 00:21:27.542 "num_base_bdevs": 2, 00:21:27.542 "num_base_bdevs_discovered": 2, 00:21:27.542 "num_base_bdevs_operational": 2, 00:21:27.542 "base_bdevs_list": [ 00:21:27.542 { 00:21:27.542 "name": "spare", 00:21:27.542 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:27.542 "is_configured": true, 00:21:27.542 "data_offset": 2048, 00:21:27.542 "data_size": 63488 00:21:27.542 }, 00:21:27.542 { 00:21:27.542 "name": "BaseBdev2", 00:21:27.542 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:27.542 "is_configured": true, 00:21:27.542 "data_offset": 2048, 00:21:27.542 "data_size": 63488 00:21:27.542 } 00:21:27.542 ] 00:21:27.542 }' 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:27.542 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:27.802 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:27.802 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.802 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:27.802 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:27.802 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:28.062 [2024-07-15 07:57:12.687616] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.062 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.322 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.322 "name": "raid_bdev1", 00:21:28.322 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:28.322 "strip_size_kb": 0, 00:21:28.322 "state": "online", 00:21:28.322 "raid_level": "raid1", 00:21:28.322 "superblock": true, 00:21:28.322 "num_base_bdevs": 2, 00:21:28.322 "num_base_bdevs_discovered": 1, 00:21:28.322 "num_base_bdevs_operational": 1, 00:21:28.322 "base_bdevs_list": [ 00:21:28.322 { 00:21:28.322 "name": null, 00:21:28.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.322 "is_configured": false, 00:21:28.322 "data_offset": 2048, 00:21:28.322 "data_size": 63488 00:21:28.322 }, 00:21:28.322 { 00:21:28.322 "name": "BaseBdev2", 00:21:28.322 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:28.322 "is_configured": true, 00:21:28.322 "data_offset": 2048, 00:21:28.322 "data_size": 63488 00:21:28.322 } 00:21:28.322 ] 00:21:28.322 }' 00:21:28.322 07:57:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.322 07:57:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:28.892 07:57:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:28.892 [2024-07-15 07:57:13.565847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:28.892 [2024-07-15 07:57:13.565956] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:28.892 [2024-07-15 07:57:13.565965] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:28.892 [2024-07-15 07:57:13.565984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:28.892 [2024-07-15 07:57:13.569340] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1996900 00:21:28.892 [2024-07-15 07:57:13.570937] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:28.892 07:57:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:29.831 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:29.831 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:29.831 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:29.831 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:29.831 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:30.092 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.092 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.092 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:30.092 "name": "raid_bdev1", 00:21:30.092 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:30.092 "strip_size_kb": 0, 00:21:30.092 "state": "online", 00:21:30.092 "raid_level": "raid1", 00:21:30.092 "superblock": true, 00:21:30.092 "num_base_bdevs": 2, 00:21:30.092 "num_base_bdevs_discovered": 2, 00:21:30.092 "num_base_bdevs_operational": 2, 00:21:30.092 "process": { 00:21:30.092 "type": "rebuild", 00:21:30.092 "target": "spare", 00:21:30.092 "progress": { 00:21:30.092 "blocks": 22528, 00:21:30.092 "percent": 35 00:21:30.092 } 00:21:30.092 }, 00:21:30.092 "base_bdevs_list": [ 00:21:30.092 { 00:21:30.092 "name": "spare", 00:21:30.092 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:30.092 "is_configured": true, 00:21:30.092 "data_offset": 2048, 00:21:30.092 "data_size": 63488 00:21:30.092 }, 00:21:30.092 { 00:21:30.092 "name": "BaseBdev2", 00:21:30.092 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:30.092 "is_configured": true, 00:21:30.092 "data_offset": 2048, 00:21:30.092 "data_size": 63488 00:21:30.092 } 00:21:30.092 ] 00:21:30.092 }' 00:21:30.092 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:30.092 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:30.092 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:30.353 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:30.353 07:57:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:30.353 [2024-07-15 07:57:15.047762] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:30.353 [2024-07-15 07:57:15.079795] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:30.353 [2024-07-15 07:57:15.079828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:21:30.353 [2024-07-15 07:57:15.079838] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:30.353 [2024-07-15 07:57:15.079842] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.353 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.613 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.613 "name": "raid_bdev1", 00:21:30.613 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:30.613 "strip_size_kb": 0, 00:21:30.613 "state": "online", 00:21:30.613 "raid_level": "raid1", 00:21:30.613 "superblock": true, 00:21:30.613 "num_base_bdevs": 2, 00:21:30.613 "num_base_bdevs_discovered": 1, 00:21:30.613 "num_base_bdevs_operational": 1, 00:21:30.613 "base_bdevs_list": [ 00:21:30.613 { 00:21:30.613 "name": null, 00:21:30.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.613 "is_configured": false, 00:21:30.613 "data_offset": 2048, 00:21:30.613 "data_size": 63488 00:21:30.613 }, 00:21:30.613 { 00:21:30.613 "name": "BaseBdev2", 00:21:30.613 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:30.613 "is_configured": true, 00:21:30.613 "data_offset": 2048, 00:21:30.613 "data_size": 63488 00:21:30.613 } 00:21:30.613 ] 00:21:30.613 }' 00:21:30.613 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.613 07:57:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:31.184 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:31.184 [2024-07-15 07:57:15.933960] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:31.184 [2024-07-15 07:57:15.933994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.184 [2024-07-15 07:57:15.934008] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1999410 00:21:31.184 [2024-07-15 07:57:15.934014] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:21:31.184 [2024-07-15 07:57:15.934326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.185 [2024-07-15 07:57:15.934340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:31.185 [2024-07-15 07:57:15.934398] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:31.185 [2024-07-15 07:57:15.934406] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:31.185 [2024-07-15 07:57:15.934412] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:31.185 [2024-07-15 07:57:15.934423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:31.185 [2024-07-15 07:57:15.937735] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19958f0 00:21:31.185 [2024-07-15 07:57:15.938883] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:31.185 spare 00:21:31.445 07:57:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.384 07:57:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.644 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:32.644 "name": "raid_bdev1", 00:21:32.644 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:32.644 "strip_size_kb": 0, 00:21:32.644 "state": "online", 00:21:32.644 "raid_level": "raid1", 00:21:32.644 "superblock": true, 00:21:32.644 "num_base_bdevs": 2, 00:21:32.644 "num_base_bdevs_discovered": 2, 00:21:32.644 "num_base_bdevs_operational": 2, 00:21:32.644 "process": { 00:21:32.644 "type": "rebuild", 00:21:32.644 "target": "spare", 00:21:32.644 "progress": { 00:21:32.644 "blocks": 22528, 00:21:32.644 "percent": 35 00:21:32.644 } 00:21:32.644 }, 00:21:32.644 "base_bdevs_list": [ 00:21:32.644 { 00:21:32.644 "name": "spare", 00:21:32.644 "uuid": "af6ccc2a-224c-539e-ba50-a8753a2a8c20", 00:21:32.644 "is_configured": true, 00:21:32.644 "data_offset": 2048, 00:21:32.644 "data_size": 63488 00:21:32.644 }, 00:21:32.644 { 00:21:32.644 "name": "BaseBdev2", 00:21:32.644 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:32.644 "is_configured": true, 00:21:32.644 "data_offset": 2048, 00:21:32.644 "data_size": 63488 00:21:32.644 } 00:21:32.644 ] 00:21:32.644 }' 00:21:32.644 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:32.644 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:32.644 07:57:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:32.644 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:32.644 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:32.644 [2024-07-15 07:57:17.391569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:32.903 [2024-07-15 07:57:17.447673] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:32.903 [2024-07-15 07:57:17.447716] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.903 [2024-07-15 07:57:17.447726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:32.903 [2024-07-15 07:57:17.447730] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.903 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.162 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.162 "name": "raid_bdev1", 00:21:33.162 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:33.162 "strip_size_kb": 0, 00:21:33.162 "state": "online", 00:21:33.162 "raid_level": "raid1", 00:21:33.162 "superblock": true, 00:21:33.162 "num_base_bdevs": 2, 00:21:33.162 "num_base_bdevs_discovered": 1, 00:21:33.162 "num_base_bdevs_operational": 1, 00:21:33.162 "base_bdevs_list": [ 00:21:33.162 { 00:21:33.162 "name": null, 00:21:33.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.162 "is_configured": false, 00:21:33.162 "data_offset": 2048, 00:21:33.162 "data_size": 63488 00:21:33.162 }, 00:21:33.162 { 00:21:33.162 "name": "BaseBdev2", 00:21:33.162 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:33.162 "is_configured": true, 00:21:33.162 "data_offset": 2048, 00:21:33.162 "data_size": 63488 00:21:33.162 } 00:21:33.162 ] 00:21:33.162 }' 00:21:33.162 07:57:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.162 07:57:17 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.422 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.682 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:33.682 "name": "raid_bdev1", 00:21:33.682 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:33.682 "strip_size_kb": 0, 00:21:33.682 "state": "online", 00:21:33.682 "raid_level": "raid1", 00:21:33.682 "superblock": true, 00:21:33.682 "num_base_bdevs": 2, 00:21:33.682 "num_base_bdevs_discovered": 1, 00:21:33.682 "num_base_bdevs_operational": 1, 00:21:33.682 "base_bdevs_list": [ 00:21:33.682 { 00:21:33.682 "name": null, 00:21:33.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.682 "is_configured": false, 00:21:33.682 "data_offset": 2048, 00:21:33.682 "data_size": 63488 00:21:33.682 }, 00:21:33.682 { 00:21:33.682 "name": "BaseBdev2", 00:21:33.682 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:33.682 "is_configured": true, 00:21:33.682 "data_offset": 2048, 00:21:33.682 "data_size": 63488 00:21:33.682 } 00:21:33.682 ] 00:21:33.682 }' 00:21:33.682 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:33.682 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:33.682 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:33.941 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:33.941 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:33.941 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:34.201 [2024-07-15 07:57:18.791104] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:34.201 [2024-07-15 07:57:18.791136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:34.201 [2024-07-15 07:57:18.791148] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x199dcf0 00:21:34.201 [2024-07-15 07:57:18.791154] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.201 [2024-07-15 07:57:18.791441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.201 [2024-07-15 07:57:18.791454] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:34.201 [2024-07-15 07:57:18.791500] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:34.201 [2024-07-15 07:57:18.791509] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:34.201 [2024-07-15 07:57:18.791514] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:34.201 BaseBdev1 00:21:34.201 07:57:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.139 07:57:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.398 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.398 "name": "raid_bdev1", 00:21:35.398 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:35.398 "strip_size_kb": 0, 00:21:35.398 "state": "online", 00:21:35.398 "raid_level": "raid1", 00:21:35.398 "superblock": true, 00:21:35.398 "num_base_bdevs": 2, 00:21:35.398 "num_base_bdevs_discovered": 1, 00:21:35.398 "num_base_bdevs_operational": 1, 00:21:35.398 "base_bdevs_list": [ 00:21:35.398 { 00:21:35.398 "name": null, 00:21:35.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.398 "is_configured": false, 00:21:35.398 "data_offset": 2048, 00:21:35.398 "data_size": 63488 00:21:35.398 }, 00:21:35.398 { 00:21:35.398 "name": "BaseBdev2", 00:21:35.398 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:35.398 "is_configured": true, 00:21:35.398 "data_offset": 2048, 00:21:35.398 "data_size": 63488 00:21:35.398 } 00:21:35.398 ] 00:21:35.398 }' 00:21:35.398 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.398 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.967 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.226 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:36.226 "name": "raid_bdev1", 00:21:36.226 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:36.226 "strip_size_kb": 0, 00:21:36.226 "state": "online", 00:21:36.226 "raid_level": "raid1", 00:21:36.226 "superblock": true, 00:21:36.226 "num_base_bdevs": 2, 00:21:36.226 "num_base_bdevs_discovered": 1, 00:21:36.226 "num_base_bdevs_operational": 1, 00:21:36.226 "base_bdevs_list": [ 00:21:36.226 { 00:21:36.226 "name": null, 00:21:36.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.226 "is_configured": false, 00:21:36.226 "data_offset": 2048, 00:21:36.226 "data_size": 63488 00:21:36.226 }, 00:21:36.226 { 00:21:36.226 "name": "BaseBdev2", 00:21:36.226 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:36.226 "is_configured": true, 00:21:36.226 "data_offset": 2048, 00:21:36.226 "data_size": 63488 00:21:36.226 } 00:21:36.226 ] 00:21:36.226 }' 00:21:36.226 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:36.227 07:57:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:36.486 [2024-07-15 07:57:21.021123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:36.486 [2024-07-15 07:57:21.021221] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:36.486 [2024-07-15 07:57:21.021234] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:36.486 request: 00:21:36.486 { 00:21:36.486 "base_bdev": "BaseBdev1", 00:21:36.486 "raid_bdev": "raid_bdev1", 00:21:36.486 "method": "bdev_raid_add_base_bdev", 00:21:36.486 "req_id": 1 00:21:36.486 } 00:21:36.486 Got JSON-RPC error response 00:21:36.486 response: 00:21:36.486 { 00:21:36.486 "code": -22, 00:21:36.486 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:36.486 } 00:21:36.486 07:57:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:21:36.486 07:57:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:36.486 07:57:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:36.486 07:57:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:36.487 07:57:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.469 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.728 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.728 "name": "raid_bdev1", 00:21:37.728 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:37.728 "strip_size_kb": 0, 00:21:37.728 "state": "online", 00:21:37.728 "raid_level": "raid1", 00:21:37.728 "superblock": true, 00:21:37.728 "num_base_bdevs": 2, 00:21:37.728 "num_base_bdevs_discovered": 1, 00:21:37.728 "num_base_bdevs_operational": 1, 00:21:37.728 
"base_bdevs_list": [ 00:21:37.728 { 00:21:37.728 "name": null, 00:21:37.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.728 "is_configured": false, 00:21:37.728 "data_offset": 2048, 00:21:37.728 "data_size": 63488 00:21:37.728 }, 00:21:37.728 { 00:21:37.728 "name": "BaseBdev2", 00:21:37.728 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:37.728 "is_configured": true, 00:21:37.728 "data_offset": 2048, 00:21:37.728 "data_size": 63488 00:21:37.728 } 00:21:37.728 ] 00:21:37.728 }' 00:21:37.728 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.728 07:57:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:38.294 "name": "raid_bdev1", 00:21:38.294 "uuid": "d9f2dbc2-10e4-44c1-935a-f60e18159c2f", 00:21:38.294 "strip_size_kb": 0, 00:21:38.294 "state": "online", 00:21:38.294 "raid_level": "raid1", 00:21:38.294 "superblock": true, 00:21:38.294 "num_base_bdevs": 2, 00:21:38.294 "num_base_bdevs_discovered": 1, 00:21:38.294 "num_base_bdevs_operational": 1, 00:21:38.294 "base_bdevs_list": [ 00:21:38.294 { 00:21:38.294 "name": null, 00:21:38.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.294 "is_configured": false, 00:21:38.294 "data_offset": 2048, 00:21:38.294 "data_size": 63488 00:21:38.294 }, 00:21:38.294 { 00:21:38.294 "name": "BaseBdev2", 00:21:38.294 "uuid": "0ab943f7-730b-55bf-9f75-b35f9f35fd07", 00:21:38.294 "is_configured": true, 00:21:38.294 "data_offset": 2048, 00:21:38.294 "data_size": 63488 00:21:38.294 } 00:21:38.294 ] 00:21:38.294 }' 00:21:38.294 07:57:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:38.294 07:57:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:38.294 07:57:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1707389 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1707389 ']' 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1707389 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:38.566 07:57:23 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1707389 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1707389' 00:21:38.566 killing process with pid 1707389 00:21:38.566 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1707389 00:21:38.566 Received shutdown signal, test time was about 60.000000 seconds 00:21:38.566 00:21:38.566 Latency(us) 00:21:38.566 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:38.566 =================================================================================================================== 00:21:38.567 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:38.567 [2024-07-15 07:57:23.096076] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:38.567 [2024-07-15 07:57:23.096146] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:38.567 [2024-07-15 07:57:23.096175] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:38.567 [2024-07-15 07:57:23.096182] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1995920 name raid_bdev1, state offline 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1707389 00:21:38.567 [2024-07-15 07:57:23.111141] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:21:38.567 00:21:38.567 real 0m33.019s 00:21:38.567 user 0m47.235s 00:21:38.567 sys 0m4.913s 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:38.567 ************************************ 00:21:38.567 END TEST raid_rebuild_test_sb 00:21:38.567 ************************************ 00:21:38.567 07:57:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:38.567 07:57:23 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:21:38.567 07:57:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:38.567 07:57:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:38.567 07:57:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:38.567 ************************************ 00:21:38.567 START TEST raid_rebuild_test_io 00:21:38.567 ************************************ 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 
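The trace that continues below is raid_rebuild_test_io: the same raid1 rebuild flow recorded above, but with bdevperf driving a background randrw workload. A minimal hand-run sketch of that RPC sequence — assuming an SPDK application is already serving RPCs on /var/tmp/spdk-raid.sock, and with $SPDK used here only as a placeholder for the spdk checkout path shown in the trace — would be (an illustration, not the test script itself):
  # build two passthru-wrapped malloc bdevs, as the trace does for BaseBdev1/BaseBdev2
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
  # assemble the raid1 bdev and confirm it comes up online with both base bdevs
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'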
00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:38.567 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:38.568 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1713924 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1713924 /var/tmp/spdk-raid.sock 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1713924 ']' 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:38.833 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:38.833 07:57:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:38.833 [2024-07-15 07:57:23.375271] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:21:38.833 [2024-07-15 07:57:23.375318] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713924 ] 00:21:38.833 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:38.833 Zero copy mechanism will not be used. 00:21:38.833 [2024-07-15 07:57:23.464588] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:38.833 [2024-07-15 07:57:23.539644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:38.833 [2024-07-15 07:57:23.578954] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:38.833 [2024-07-15 07:57:23.578979] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:39.771 07:57:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:39.771 07:57:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:21:39.771 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:39.771 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:39.771 BaseBdev1_malloc 00:21:39.771 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:40.031 [2024-07-15 07:57:24.533764] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:40.031 [2024-07-15 07:57:24.533796] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:40.031 [2024-07-15 07:57:24.533809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x86ed30 00:21:40.031 [2024-07-15 07:57:24.533816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:40.031 [2024-07-15 07:57:24.535098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:40.031 [2024-07-15 07:57:24.535119] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:40.031 BaseBdev1 00:21:40.031 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:40.031 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:40.031 BaseBdev2_malloc 00:21:40.031 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:40.290 [2024-07-15 07:57:24.904719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:40.290 [2024-07-15 07:57:24.904747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:40.290 [2024-07-15 07:57:24.904758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa21c60 00:21:40.290 [2024-07-15 07:57:24.904764] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:40.290 [2024-07-15 07:57:24.905952] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:40.290 [2024-07-15 07:57:24.905971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:40.290 BaseBdev2 00:21:40.290 07:57:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:40.550 spare_malloc 00:21:40.550 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:40.550 spare_delay 00:21:40.550 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:40.810 [2024-07-15 07:57:25.448018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:40.810 [2024-07-15 07:57:25.448048] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:40.810 [2024-07-15 07:57:25.448060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa11ec0 00:21:40.810 [2024-07-15 07:57:25.448066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:40.810 [2024-07-15 07:57:25.449255] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:40.810 [2024-07-15 07:57:25.449275] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:40.810 spare 00:21:40.810 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:41.070 [2024-07-15 07:57:25.640516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:41.070 [2024-07-15 07:57:25.641506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:41.070 [2024-07-15 07:57:25.641561] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa09390 00:21:41.070 [2024-07-15 07:57:25.641567] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:41.070 [2024-07-15 07:57:25.641724] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8697f0 00:21:41.070 [2024-07-15 07:57:25.641835] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa09390 00:21:41.070 [2024-07-15 07:57:25.641841] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa09390 00:21:41.070 [2024-07-15 07:57:25.641920] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.070 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.331 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.331 "name": "raid_bdev1", 00:21:41.331 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:41.331 "strip_size_kb": 0, 00:21:41.331 "state": "online", 00:21:41.331 "raid_level": "raid1", 00:21:41.331 "superblock": false, 00:21:41.331 "num_base_bdevs": 2, 00:21:41.331 "num_base_bdevs_discovered": 2, 00:21:41.331 "num_base_bdevs_operational": 2, 00:21:41.331 "base_bdevs_list": [ 00:21:41.331 { 00:21:41.331 "name": "BaseBdev1", 00:21:41.331 "uuid": "420d4e26-064b-593f-ac0a-e1fc879edceb", 00:21:41.331 "is_configured": true, 00:21:41.331 "data_offset": 0, 00:21:41.331 "data_size": 65536 00:21:41.331 }, 00:21:41.331 { 00:21:41.331 "name": "BaseBdev2", 00:21:41.331 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:41.331 "is_configured": true, 00:21:41.331 "data_offset": 0, 00:21:41.331 "data_size": 65536 00:21:41.331 } 00:21:41.331 ] 00:21:41.331 }' 00:21:41.331 07:57:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.331 07:57:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:41.902 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:41.902 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:41.902 [2024-07-15 07:57:26.559028] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:41.902 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:41.902 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.902 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:42.162 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:42.162 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:42.162 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:42.162 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:42.162 [2024-07-15 07:57:26.865057] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0x8670d0 00:21:42.162 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:42.162 Zero copy mechanism will not be used. 00:21:42.162 Running I/O for 60 seconds... 00:21:42.422 [2024-07-15 07:57:26.949706] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:42.422 [2024-07-15 07:57:26.956256] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x8670d0 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.422 07:57:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.682 07:57:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.682 "name": "raid_bdev1", 00:21:42.682 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:42.682 "strip_size_kb": 0, 00:21:42.682 "state": "online", 00:21:42.682 "raid_level": "raid1", 00:21:42.682 "superblock": false, 00:21:42.682 "num_base_bdevs": 2, 00:21:42.682 "num_base_bdevs_discovered": 1, 00:21:42.682 "num_base_bdevs_operational": 1, 00:21:42.682 "base_bdevs_list": [ 00:21:42.682 { 00:21:42.682 "name": null, 00:21:42.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.682 "is_configured": false, 00:21:42.682 "data_offset": 0, 00:21:42.682 "data_size": 65536 00:21:42.682 }, 00:21:42.682 { 00:21:42.682 "name": "BaseBdev2", 00:21:42.682 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:42.682 "is_configured": true, 00:21:42.682 "data_offset": 0, 00:21:42.682 "data_size": 65536 00:21:42.682 } 00:21:42.682 ] 00:21:42.682 }' 00:21:42.682 07:57:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.682 07:57:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:43.252 07:57:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:43.252 [2024-07-15 07:57:27.921724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:43.252 07:57:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:43.252 [2024-07-15 07:57:27.961032] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x574b30 
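The rebuild announced in the next log line is what the verify_raid_bdev_process helper polls for. The same fields can be watched by hand with the RPC and jq filters the test itself uses — a sketch, again assuming the /var/tmp/spdk-raid.sock socket from the trace ($SPDK as above):
  # report rebuild type, target and percent for raid_bdev1 (the fields visible in the JSON dumps below)
  $SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | {type: (.process.type // "none"), target: (.process.target // "none"), percent: .process.progress.percent}'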
00:21:43.252 [2024-07-15 07:57:27.962641] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:43.513 [2024-07-15 07:57:28.070403] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:43.513 [2024-07-15 07:57:28.070637] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:43.773 [2024-07-15 07:57:28.272301] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:43.773 [2024-07-15 07:57:28.272431] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:44.033 [2024-07-15 07:57:28.589497] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:44.292 [2024-07-15 07:57:28.798589] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:44.292 [2024-07-15 07:57:28.798734] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:44.292 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:44.292 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:44.292 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:44.292 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:44.292 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:44.293 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.293 07:57:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.553 [2024-07-15 07:57:29.063680] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:44.554 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:44.554 "name": "raid_bdev1", 00:21:44.554 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:44.554 "strip_size_kb": 0, 00:21:44.554 "state": "online", 00:21:44.554 "raid_level": "raid1", 00:21:44.554 "superblock": false, 00:21:44.554 "num_base_bdevs": 2, 00:21:44.554 "num_base_bdevs_discovered": 2, 00:21:44.554 "num_base_bdevs_operational": 2, 00:21:44.554 "process": { 00:21:44.554 "type": "rebuild", 00:21:44.554 "target": "spare", 00:21:44.554 "progress": { 00:21:44.554 "blocks": 14336, 00:21:44.554 "percent": 21 00:21:44.554 } 00:21:44.554 }, 00:21:44.554 "base_bdevs_list": [ 00:21:44.554 { 00:21:44.554 "name": "spare", 00:21:44.554 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:44.554 "is_configured": true, 00:21:44.554 "data_offset": 0, 00:21:44.554 "data_size": 65536 00:21:44.554 }, 00:21:44.554 { 00:21:44.554 "name": "BaseBdev2", 00:21:44.554 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:44.554 "is_configured": true, 00:21:44.554 "data_offset": 0, 00:21:44.554 "data_size": 65536 00:21:44.554 } 00:21:44.554 ] 00:21:44.554 }' 00:21:44.554 07:57:29 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:44.554 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:44.554 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:44.554 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:44.554 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:44.554 [2024-07-15 07:57:29.278192] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:44.554 [2024-07-15 07:57:29.278327] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:44.813 [2024-07-15 07:57:29.437126] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:45.072 [2024-07-15 07:57:29.608696] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:45.072 [2024-07-15 07:57:29.616401] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:45.072 [2024-07-15 07:57:29.616420] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:45.072 [2024-07-15 07:57:29.616425] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:45.072 [2024-07-15 07:57:29.620530] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x8670d0 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.072 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.332 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.332 "name": "raid_bdev1", 00:21:45.332 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:45.332 "strip_size_kb": 0, 00:21:45.332 "state": "online", 00:21:45.332 "raid_level": "raid1", 00:21:45.332 "superblock": false, 00:21:45.332 "num_base_bdevs": 2, 00:21:45.332 "num_base_bdevs_discovered": 1, 00:21:45.332 
"num_base_bdevs_operational": 1, 00:21:45.332 "base_bdevs_list": [ 00:21:45.332 { 00:21:45.332 "name": null, 00:21:45.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.332 "is_configured": false, 00:21:45.332 "data_offset": 0, 00:21:45.332 "data_size": 65536 00:21:45.332 }, 00:21:45.332 { 00:21:45.332 "name": "BaseBdev2", 00:21:45.332 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:45.332 "is_configured": true, 00:21:45.332 "data_offset": 0, 00:21:45.332 "data_size": 65536 00:21:45.332 } 00:21:45.332 ] 00:21:45.332 }' 00:21:45.332 07:57:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.332 07:57:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:45.902 "name": "raid_bdev1", 00:21:45.902 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:45.902 "strip_size_kb": 0, 00:21:45.902 "state": "online", 00:21:45.902 "raid_level": "raid1", 00:21:45.902 "superblock": false, 00:21:45.902 "num_base_bdevs": 2, 00:21:45.902 "num_base_bdevs_discovered": 1, 00:21:45.902 "num_base_bdevs_operational": 1, 00:21:45.902 "base_bdevs_list": [ 00:21:45.902 { 00:21:45.902 "name": null, 00:21:45.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.902 "is_configured": false, 00:21:45.902 "data_offset": 0, 00:21:45.902 "data_size": 65536 00:21:45.902 }, 00:21:45.902 { 00:21:45.902 "name": "BaseBdev2", 00:21:45.902 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:45.902 "is_configured": true, 00:21:45.902 "data_offset": 0, 00:21:45.902 "data_size": 65536 00:21:45.902 } 00:21:45.902 ] 00:21:45.902 }' 00:21:45.902 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:46.163 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:46.164 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:46.164 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:46.164 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:46.164 [2024-07-15 07:57:30.900091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:46.423 [2024-07-15 07:57:30.946055] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x867180 00:21:46.423 [2024-07-15 07:57:30.947188] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:46.423 07:57:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:46.423 [2024-07-15 07:57:31.087772] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:46.684 [2024-07-15 07:57:31.215764] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:46.684 [2024-07-15 07:57:31.215877] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:46.944 [2024-07-15 07:57:31.546952] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:47.205 [2024-07-15 07:57:31.755676] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.205 07:57:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.467 [2024-07-15 07:57:32.026495] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:47.467 [2024-07-15 07:57:32.147558] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:47.467 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:47.467 "name": "raid_bdev1", 00:21:47.467 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:47.467 "strip_size_kb": 0, 00:21:47.467 "state": "online", 00:21:47.467 "raid_level": "raid1", 00:21:47.467 "superblock": false, 00:21:47.467 "num_base_bdevs": 2, 00:21:47.467 "num_base_bdevs_discovered": 2, 00:21:47.467 "num_base_bdevs_operational": 2, 00:21:47.467 "process": { 00:21:47.467 "type": "rebuild", 00:21:47.467 "target": "spare", 00:21:47.467 "progress": { 00:21:47.467 "blocks": 14336, 00:21:47.467 "percent": 21 00:21:47.467 } 00:21:47.467 }, 00:21:47.467 "base_bdevs_list": [ 00:21:47.467 { 00:21:47.467 "name": "spare", 00:21:47.467 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:47.467 "is_configured": true, 00:21:47.467 "data_offset": 0, 00:21:47.467 "data_size": 65536 00:21:47.467 }, 00:21:47.467 { 00:21:47.467 "name": "BaseBdev2", 00:21:47.467 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:47.467 "is_configured": true, 00:21:47.467 "data_offset": 0, 00:21:47.467 "data_size": 65536 00:21:47.467 } 00:21:47.467 ] 00:21:47.467 }' 00:21:47.467 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:47.467 07:57:32 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:47.467 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=718 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:47.729 "name": "raid_bdev1", 00:21:47.729 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:47.729 "strip_size_kb": 0, 00:21:47.729 "state": "online", 00:21:47.729 "raid_level": "raid1", 00:21:47.729 "superblock": false, 00:21:47.729 "num_base_bdevs": 2, 00:21:47.729 "num_base_bdevs_discovered": 2, 00:21:47.729 "num_base_bdevs_operational": 2, 00:21:47.729 "process": { 00:21:47.729 "type": "rebuild", 00:21:47.729 "target": "spare", 00:21:47.729 "progress": { 00:21:47.729 "blocks": 18432, 00:21:47.729 "percent": 28 00:21:47.729 } 00:21:47.729 }, 00:21:47.729 "base_bdevs_list": [ 00:21:47.729 { 00:21:47.729 "name": "spare", 00:21:47.729 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:47.729 "is_configured": true, 00:21:47.729 "data_offset": 0, 00:21:47.729 "data_size": 65536 00:21:47.729 }, 00:21:47.729 { 00:21:47.729 "name": "BaseBdev2", 00:21:47.729 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:47.729 "is_configured": true, 00:21:47.729 "data_offset": 0, 00:21:47.729 "data_size": 65536 00:21:47.729 } 00:21:47.729 ] 00:21:47.729 }' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:47.729 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:47.989 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:47.989 07:57:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:47.989 [2024-07-15 
07:57:32.527469] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:48.250 [2024-07-15 07:57:32.749678] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:48.250 [2024-07-15 07:57:32.749817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:48.250 [2024-07-15 07:57:32.973493] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:48.250 [2024-07-15 07:57:32.973697] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:21:48.511 [2024-07-15 07:57:33.182354] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:48.511 [2024-07-15 07:57:33.182455] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.771 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.031 [2024-07-15 07:57:33.535498] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:21:49.031 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:49.031 "name": "raid_bdev1", 00:21:49.031 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:49.031 "strip_size_kb": 0, 00:21:49.031 "state": "online", 00:21:49.031 "raid_level": "raid1", 00:21:49.031 "superblock": false, 00:21:49.031 "num_base_bdevs": 2, 00:21:49.031 "num_base_bdevs_discovered": 2, 00:21:49.031 "num_base_bdevs_operational": 2, 00:21:49.031 "process": { 00:21:49.031 "type": "rebuild", 00:21:49.031 "target": "spare", 00:21:49.031 "progress": { 00:21:49.031 "blocks": 36864, 00:21:49.031 "percent": 56 00:21:49.031 } 00:21:49.031 }, 00:21:49.031 "base_bdevs_list": [ 00:21:49.031 { 00:21:49.031 "name": "spare", 00:21:49.031 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:49.031 "is_configured": true, 00:21:49.031 "data_offset": 0, 00:21:49.031 "data_size": 65536 00:21:49.031 }, 00:21:49.031 { 00:21:49.031 "name": "BaseBdev2", 00:21:49.031 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:49.031 "is_configured": true, 00:21:49.031 "data_offset": 0, 00:21:49.031 "data_size": 65536 00:21:49.031 } 00:21:49.031 ] 00:21:49.031 }' 00:21:49.031 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:21:49.031 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:49.031 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:49.031 [2024-07-15 07:57:33.772340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:21:49.291 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:49.291 07:57:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:49.291 [2024-07-15 07:57:34.007260] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:49.291 [2024-07-15 07:57:34.007409] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:21:49.862 [2024-07-15 07:57:34.452743] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:21:50.123 [2024-07-15 07:57:34.810353] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.123 07:57:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.384 07:57:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:50.384 "name": "raid_bdev1", 00:21:50.384 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:50.384 "strip_size_kb": 0, 00:21:50.384 "state": "online", 00:21:50.384 "raid_level": "raid1", 00:21:50.384 "superblock": false, 00:21:50.384 "num_base_bdevs": 2, 00:21:50.384 "num_base_bdevs_discovered": 2, 00:21:50.384 "num_base_bdevs_operational": 2, 00:21:50.384 "process": { 00:21:50.384 "type": "rebuild", 00:21:50.384 "target": "spare", 00:21:50.384 "progress": { 00:21:50.384 "blocks": 51200, 00:21:50.384 "percent": 78 00:21:50.384 } 00:21:50.384 }, 00:21:50.384 "base_bdevs_list": [ 00:21:50.384 { 00:21:50.384 "name": "spare", 00:21:50.384 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:50.384 "is_configured": true, 00:21:50.384 "data_offset": 0, 00:21:50.384 "data_size": 65536 00:21:50.384 }, 00:21:50.384 { 00:21:50.384 "name": "BaseBdev2", 00:21:50.384 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:50.384 "is_configured": true, 00:21:50.384 "data_offset": 0, 00:21:50.384 "data_size": 65536 00:21:50.384 } 00:21:50.384 ] 00:21:50.384 }' 00:21:50.384 07:57:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // 
"none"' 00:21:50.384 07:57:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:50.384 07:57:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:50.384 07:57:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:50.384 07:57:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:50.645 [2024-07-15 07:57:35.370062] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:21:51.215 [2024-07-15 07:57:35.794597] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:51.215 [2024-07-15 07:57:35.893594] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:51.215 [2024-07-15 07:57:35.894618] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.515 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:51.777 "name": "raid_bdev1", 00:21:51.777 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:51.777 "strip_size_kb": 0, 00:21:51.777 "state": "online", 00:21:51.777 "raid_level": "raid1", 00:21:51.777 "superblock": false, 00:21:51.777 "num_base_bdevs": 2, 00:21:51.777 "num_base_bdevs_discovered": 2, 00:21:51.777 "num_base_bdevs_operational": 2, 00:21:51.777 "base_bdevs_list": [ 00:21:51.777 { 00:21:51.777 "name": "spare", 00:21:51.777 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:51.777 "is_configured": true, 00:21:51.777 "data_offset": 0, 00:21:51.777 "data_size": 65536 00:21:51.777 }, 00:21:51.777 { 00:21:51.777 "name": "BaseBdev2", 00:21:51.777 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:51.777 "is_configured": true, 00:21:51.777 "data_offset": 0, 00:21:51.777 "data_size": 65536 00:21:51.777 } 00:21:51.777 ] 00:21:51.777 }' 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.777 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.037 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:52.037 "name": "raid_bdev1", 00:21:52.037 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:52.037 "strip_size_kb": 0, 00:21:52.037 "state": "online", 00:21:52.037 "raid_level": "raid1", 00:21:52.038 "superblock": false, 00:21:52.038 "num_base_bdevs": 2, 00:21:52.038 "num_base_bdevs_discovered": 2, 00:21:52.038 "num_base_bdevs_operational": 2, 00:21:52.038 "base_bdevs_list": [ 00:21:52.038 { 00:21:52.038 "name": "spare", 00:21:52.038 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:52.038 "is_configured": true, 00:21:52.038 "data_offset": 0, 00:21:52.038 "data_size": 65536 00:21:52.038 }, 00:21:52.038 { 00:21:52.038 "name": "BaseBdev2", 00:21:52.038 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:52.038 "is_configured": true, 00:21:52.038 "data_offset": 0, 00:21:52.038 "data_size": 65536 00:21:52.038 } 00:21:52.038 ] 00:21:52.038 }' 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.038 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.038 
07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.298 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.298 "name": "raid_bdev1", 00:21:52.298 "uuid": "99b26dc0-0a8e-4676-be05-af4e51184d7c", 00:21:52.298 "strip_size_kb": 0, 00:21:52.298 "state": "online", 00:21:52.298 "raid_level": "raid1", 00:21:52.298 "superblock": false, 00:21:52.298 "num_base_bdevs": 2, 00:21:52.298 "num_base_bdevs_discovered": 2, 00:21:52.298 "num_base_bdevs_operational": 2, 00:21:52.298 "base_bdevs_list": [ 00:21:52.298 { 00:21:52.298 "name": "spare", 00:21:52.298 "uuid": "e77f2174-5d86-5066-a9e6-73fdcd6e9f2f", 00:21:52.298 "is_configured": true, 00:21:52.298 "data_offset": 0, 00:21:52.298 "data_size": 65536 00:21:52.298 }, 00:21:52.298 { 00:21:52.298 "name": "BaseBdev2", 00:21:52.298 "uuid": "07793add-ea7a-5fc1-9ef2-be7a65295a75", 00:21:52.298 "is_configured": true, 00:21:52.298 "data_offset": 0, 00:21:52.298 "data_size": 65536 00:21:52.298 } 00:21:52.298 ] 00:21:52.298 }' 00:21:52.298 07:57:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.298 07:57:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:52.870 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:52.870 [2024-07-15 07:57:37.593251] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:52.870 [2024-07-15 07:57:37.593273] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:52.870 00:21:52.870 Latency(us) 00:21:52.870 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:52.870 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:52.870 raid_bdev1 : 10.73 108.02 324.06 0.00 0.00 12458.82 244.18 114536.76 00:21:52.870 =================================================================================================================== 00:21:52.870 Total : 108.02 324.06 0.00 0.00 12458.82 244.18 114536.76 00:21:52.870 [2024-07-15 07:57:37.624537] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.870 [2024-07-15 07:57:37.624560] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:52.870 [2024-07-15 07:57:37.624616] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:52.871 [2024-07-15 07:57:37.624622] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa09390 name raid_bdev1, state offline 00:21:52.871 0 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:53.131 07:57:37 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:53.131 07:57:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:53.391 /dev/nbd0 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:53.391 1+0 records in 00:21:53.391 1+0 records out 00:21:53.391 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279747 s, 14.6 MB/s 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:53.391 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:21:53.652 /dev/nbd1 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:53.652 1+0 records in 00:21:53.652 1+0 records out 00:21:53.652 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250228 s, 16.4 MB/s 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 
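What the trace above amounts to: the rebuilt leg ("spare") and the surviving leg ("BaseBdev2") are exported as NBD block devices over the same RPC socket and then byte-compared with cmp, so the rebuild only passes if both raid1 legs hold identical data. Below is a minimal sketch of that flow, assuming the same rpc.py path and socket as in the trace; the wrapper name verify_rebuilt_copy is hypothetical, and the real helpers (nbd_start_disks, waitfornbd) add retry and cleanup logic not shown here.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

verify_rebuilt_copy() {
    # export both bdevs as NBD devices (same RPCs as in the trace above)
    $rpc -s $sock nbd_start_disk spare /dev/nbd0
    $rpc -s $sock nbd_start_disk BaseBdev2 /dev/nbd1
    # after a raid1 rebuild both legs must match; -i 0 compares from offset 0
    cmp -i 0 /dev/nbd0 /dev/nbd1
    local rc=$?
    # detach the NBD devices again
    $rpc -s $sock nbd_stop_disk /dev/nbd1
    $rpc -s $sock nbd_stop_disk /dev/nbd0
    return $rc
}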
00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:53.652 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:53.913 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1713924 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1713924 ']' 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@952 -- # kill -0 1713924 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1713924 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1713924' 00:21:54.173 killing process with pid 1713924 00:21:54.173 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1713924 00:21:54.173 Received shutdown signal, test time was about 11.913129 seconds 00:21:54.173 00:21:54.174 Latency(us) 00:21:54.174 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:54.174 =================================================================================================================== 00:21:54.174 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:54.174 [2024-07-15 07:57:38.807835] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:54.174 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1713924 00:21:54.174 [2024-07-15 07:57:38.819009] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:54.435 07:57:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:21:54.435 00:21:54.435 real 0m15.632s 00:21:54.435 user 0m23.874s 00:21:54.435 sys 0m1.894s 00:21:54.435 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:54.435 07:57:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:54.435 ************************************ 00:21:54.435 END TEST raid_rebuild_test_io 00:21:54.435 ************************************ 00:21:54.435 07:57:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:54.435 07:57:38 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:21:54.435 07:57:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:21:54.435 07:57:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:54.435 07:57:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:54.435 ************************************ 00:21:54.435 START TEST raid_rebuild_test_sb_io 00:21:54.435 ************************************ 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:54.435 07:57:39 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1716636 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1716636 /var/tmp/spdk-raid.sock 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1716636 ']' 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:54.435 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:54.435 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:54.435 [2024-07-15 07:57:39.088239] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
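For context on the process being started above: the sb_io variant launches bdevperf as a long-running RPC target on /var/tmp/spdk-raid.sock and only afterwards configures the RAID under test through rpc.py. A minimal sketch of that launch pattern follows, reusing the flag values shown in the trace (60 s randrw at a 50% mix, 3 MiB I/Os, queue depth 2, -z so the workload waits for an explicit start); the socket wait loop is a simplified stand-in for the waitforlisten helper, which also tracks the pid.

spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
sock=/var/tmp/spdk-raid.sock

# start bdevperf in the background; -r selects the RPC socket, -T the bdev to exercise,
# -z delays the workload until perform_tests is requested over RPC
$spdk/build/examples/bdevperf -r $sock -T raid_bdev1 -t 60 -w randrw -M 50 \
    -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!

# crude wait for the UNIX-domain RPC socket to appear before issuing any rpc.py calls
while [ ! -S $sock ]; do
    sleep 0.1
done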
00:21:54.435 [2024-07-15 07:57:39.088297] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1716636 ] 00:21:54.435 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:54.435 Zero copy mechanism will not be used. 00:21:54.435 [2024-07-15 07:57:39.177244] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:54.696 [2024-07-15 07:57:39.245136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:54.696 [2024-07-15 07:57:39.296433] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:54.696 [2024-07-15 07:57:39.296458] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:55.268 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:55.268 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:21:55.268 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:55.268 07:57:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:55.528 BaseBdev1_malloc 00:21:55.528 07:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:55.528 [2024-07-15 07:57:40.267211] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:55.528 [2024-07-15 07:57:40.267244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.528 [2024-07-15 07:57:40.267257] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2507d30 00:21:55.528 [2024-07-15 07:57:40.267264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.528 [2024-07-15 07:57:40.268560] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.528 [2024-07-15 07:57:40.268581] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:55.528 BaseBdev1 00:21:55.528 07:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:55.528 07:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:55.788 BaseBdev2_malloc 00:21:55.788 07:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:56.049 [2024-07-15 07:57:40.654209] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:56.049 [2024-07-15 07:57:40.654238] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.049 [2024-07-15 07:57:40.654250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26bac60 00:21:56.049 [2024-07-15 07:57:40.654256] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.049 [2024-07-15 
07:57:40.655466] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.049 [2024-07-15 07:57:40.655484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:56.049 BaseBdev2 00:21:56.049 07:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:56.309 spare_malloc 00:21:56.309 07:57:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:56.309 spare_delay 00:21:56.309 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:56.570 [2024-07-15 07:57:41.221528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:56.570 [2024-07-15 07:57:41.221559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.570 [2024-07-15 07:57:41.221571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26aaec0 00:21:56.570 [2024-07-15 07:57:41.221578] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.570 [2024-07-15 07:57:41.222791] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.570 [2024-07-15 07:57:41.222810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:56.570 spare 00:21:56.570 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:56.830 [2024-07-15 07:57:41.410026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:56.830 [2024-07-15 07:57:41.411029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:56.830 [2024-07-15 07:57:41.411142] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26a2390 00:21:56.830 [2024-07-15 07:57:41.411150] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:56.830 [2024-07-15 07:57:41.411296] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24fe7c0 00:21:56.830 [2024-07-15 07:57:41.411405] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26a2390 00:21:56.830 [2024-07-15 07:57:41.411411] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26a2390 00:21:56.830 [2024-07-15 07:57:41.411476] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.831 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.092 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.092 "name": "raid_bdev1", 00:21:57.092 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:21:57.092 "strip_size_kb": 0, 00:21:57.092 "state": "online", 00:21:57.092 "raid_level": "raid1", 00:21:57.092 "superblock": true, 00:21:57.092 "num_base_bdevs": 2, 00:21:57.092 "num_base_bdevs_discovered": 2, 00:21:57.092 "num_base_bdevs_operational": 2, 00:21:57.092 "base_bdevs_list": [ 00:21:57.092 { 00:21:57.092 "name": "BaseBdev1", 00:21:57.092 "uuid": "9016f788-6152-5bf7-81a7-70cec1259aa8", 00:21:57.092 "is_configured": true, 00:21:57.092 "data_offset": 2048, 00:21:57.092 "data_size": 63488 00:21:57.092 }, 00:21:57.092 { 00:21:57.092 "name": "BaseBdev2", 00:21:57.092 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:21:57.092 "is_configured": true, 00:21:57.092 "data_offset": 2048, 00:21:57.092 "data_size": 63488 00:21:57.092 } 00:21:57.092 ] 00:21:57.092 }' 00:21:57.092 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.092 07:57:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:57.663 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:57.663 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:57.663 [2024-07-15 07:57:42.292427] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:57.663 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:57.663 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.663 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:57.924 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:57.924 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:21:57.924 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:57.924 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:21:57.924 [2024-07-15 07:57:42.586391] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a1260 00:21:57.924 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:57.924 Zero copy mechanism will not be used. 00:21:57.924 Running I/O for 60 seconds... 00:21:58.184 [2024-07-15 07:57:42.694034] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:58.185 [2024-07-15 07:57:42.707058] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26a1260 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.185 "name": "raid_bdev1", 00:21:58.185 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:21:58.185 "strip_size_kb": 0, 00:21:58.185 "state": "online", 00:21:58.185 "raid_level": "raid1", 00:21:58.185 "superblock": true, 00:21:58.185 "num_base_bdevs": 2, 00:21:58.185 "num_base_bdevs_discovered": 1, 00:21:58.185 "num_base_bdevs_operational": 1, 00:21:58.185 "base_bdevs_list": [ 00:21:58.185 { 00:21:58.185 "name": null, 00:21:58.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.185 "is_configured": false, 00:21:58.185 "data_offset": 2048, 00:21:58.185 "data_size": 63488 00:21:58.185 }, 00:21:58.185 { 00:21:58.185 "name": "BaseBdev2", 00:21:58.185 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:21:58.185 "is_configured": true, 00:21:58.185 "data_offset": 2048, 00:21:58.185 "data_size": 63488 00:21:58.185 } 00:21:58.185 ] 00:21:58.185 }' 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.185 07:57:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:58.755 07:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:59.014 [2024-07-15 07:57:43.634162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:59.014 
07:57:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:59.014 [2024-07-15 07:57:43.673649] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2518d10 00:21:59.014 [2024-07-15 07:57:43.675232] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:59.274 [2024-07-15 07:57:43.789912] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:59.274 [2024-07-15 07:57:43.790142] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:59.274 [2024-07-15 07:57:44.011772] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:59.274 [2024-07-15 07:57:44.011885] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:59.843 [2024-07-15 07:57:44.471298] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.103 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.103 [2024-07-15 07:57:44.821086] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:00.103 [2024-07-15 07:57:44.821235] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:00.363 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:00.363 "name": "raid_bdev1", 00:22:00.363 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:00.363 "strip_size_kb": 0, 00:22:00.363 "state": "online", 00:22:00.363 "raid_level": "raid1", 00:22:00.363 "superblock": true, 00:22:00.363 "num_base_bdevs": 2, 00:22:00.363 "num_base_bdevs_discovered": 2, 00:22:00.363 "num_base_bdevs_operational": 2, 00:22:00.363 "process": { 00:22:00.363 "type": "rebuild", 00:22:00.363 "target": "spare", 00:22:00.363 "progress": { 00:22:00.363 "blocks": 16384, 00:22:00.363 "percent": 25 00:22:00.363 } 00:22:00.363 }, 00:22:00.363 "base_bdevs_list": [ 00:22:00.363 { 00:22:00.363 "name": "spare", 00:22:00.363 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:00.363 "is_configured": true, 00:22:00.363 "data_offset": 2048, 00:22:00.363 "data_size": 63488 00:22:00.363 }, 00:22:00.363 { 00:22:00.363 "name": "BaseBdev2", 00:22:00.363 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:00.363 "is_configured": true, 00:22:00.363 "data_offset": 2048, 00:22:00.363 "data_size": 63488 00:22:00.363 } 00:22:00.363 ] 00:22:00.363 }' 
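The JSON dumped above is what the test keys on while a rebuild is in flight: bdev_raid_get_bdevs exposes a process object with type "rebuild", the target base bdev, and a blocks/percent progress pair. A minimal sketch of polling it from a shell, using only the RPC and jq filters already present in the trace; the loop itself and the one-second interval are illustrative, not part of the test.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

while :; do
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    ptype=$(echo "$info" | jq -r '.process.type // "none"')      # "rebuild" while active, else "none"
    ptarget=$(echo "$info" | jq -r '.process.target // "none"')  # base bdev being rebuilt, e.g. "spare"
    pblocks=$(echo "$info" | jq -r '.process.progress.blocks // 0')
    echo "process=$ptype target=$ptarget blocks=$pblocks"
    [ "$ptype" = none ] && break
    sleep 1
done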
00:22:00.363 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:00.363 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:00.363 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:00.363 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:00.363 07:57:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:00.623 [2024-07-15 07:57:45.135895] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:00.623 [2024-07-15 07:57:45.153340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:00.623 [2024-07-15 07:57:45.253622] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:00.623 [2024-07-15 07:57:45.267928] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.623 [2024-07-15 07:57:45.267946] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:00.623 [2024-07-15 07:57:45.267952] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:00.623 [2024-07-15 07:57:45.284820] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26a1260 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.623 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.883 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.883 "name": "raid_bdev1", 00:22:00.883 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:00.883 "strip_size_kb": 0, 00:22:00.883 "state": "online", 00:22:00.883 "raid_level": "raid1", 00:22:00.883 "superblock": true, 00:22:00.883 "num_base_bdevs": 2, 00:22:00.883 "num_base_bdevs_discovered": 1, 00:22:00.883 "num_base_bdevs_operational": 1, 00:22:00.883 "base_bdevs_list": 
[ 00:22:00.883 { 00:22:00.883 "name": null, 00:22:00.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.883 "is_configured": false, 00:22:00.883 "data_offset": 2048, 00:22:00.883 "data_size": 63488 00:22:00.883 }, 00:22:00.883 { 00:22:00.883 "name": "BaseBdev2", 00:22:00.883 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:00.883 "is_configured": true, 00:22:00.883 "data_offset": 2048, 00:22:00.883 "data_size": 63488 00:22:00.883 } 00:22:00.883 ] 00:22:00.883 }' 00:22:00.883 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.883 07:57:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.451 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.710 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:01.710 "name": "raid_bdev1", 00:22:01.710 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:01.710 "strip_size_kb": 0, 00:22:01.710 "state": "online", 00:22:01.710 "raid_level": "raid1", 00:22:01.710 "superblock": true, 00:22:01.710 "num_base_bdevs": 2, 00:22:01.710 "num_base_bdevs_discovered": 1, 00:22:01.710 "num_base_bdevs_operational": 1, 00:22:01.710 "base_bdevs_list": [ 00:22:01.710 { 00:22:01.710 "name": null, 00:22:01.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.710 "is_configured": false, 00:22:01.710 "data_offset": 2048, 00:22:01.710 "data_size": 63488 00:22:01.710 }, 00:22:01.710 { 00:22:01.710 "name": "BaseBdev2", 00:22:01.710 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:01.710 "is_configured": true, 00:22:01.710 "data_offset": 2048, 00:22:01.710 "data_size": 63488 00:22:01.710 } 00:22:01.710 ] 00:22:01.710 }' 00:22:01.710 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:01.710 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:01.710 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:01.710 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:01.710 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:01.970 [2024-07-15 07:57:46.584616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:01.970 07:57:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:01.970 [2024-07-15 07:57:46.656698] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0x25199e0 00:22:01.970 [2024-07-15 07:57:46.657827] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:02.229 [2024-07-15 07:57:46.782854] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:02.488 [2024-07-15 07:57:46.998571] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:02.488 [2024-07-15 07:57:46.998689] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:02.748 [2024-07-15 07:57:47.443140] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:02.748 [2024-07-15 07:57:47.443260] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.008 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.008 [2024-07-15 07:57:47.673280] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:03.267 [2024-07-15 07:57:47.795510] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:03.267 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:03.267 "name": "raid_bdev1", 00:22:03.267 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:03.267 "strip_size_kb": 0, 00:22:03.267 "state": "online", 00:22:03.267 "raid_level": "raid1", 00:22:03.267 "superblock": true, 00:22:03.267 "num_base_bdevs": 2, 00:22:03.267 "num_base_bdevs_discovered": 2, 00:22:03.267 "num_base_bdevs_operational": 2, 00:22:03.267 "process": { 00:22:03.267 "type": "rebuild", 00:22:03.267 "target": "spare", 00:22:03.267 "progress": { 00:22:03.267 "blocks": 16384, 00:22:03.267 "percent": 25 00:22:03.267 } 00:22:03.267 }, 00:22:03.267 "base_bdevs_list": [ 00:22:03.267 { 00:22:03.267 "name": "spare", 00:22:03.267 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:03.267 "is_configured": true, 00:22:03.267 "data_offset": 2048, 00:22:03.268 "data_size": 63488 00:22:03.268 }, 00:22:03.268 { 00:22:03.268 "name": "BaseBdev2", 00:22:03.268 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:03.268 "is_configured": true, 00:22:03.268 "data_offset": 2048, 00:22:03.268 "data_size": 63488 00:22:03.268 } 00:22:03.268 ] 00:22:03.268 }' 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:03.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=733 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.268 07:57:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.527 07:57:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:03.527 "name": "raid_bdev1", 00:22:03.527 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:03.527 "strip_size_kb": 0, 00:22:03.527 "state": "online", 00:22:03.527 "raid_level": "raid1", 00:22:03.527 "superblock": true, 00:22:03.527 "num_base_bdevs": 2, 00:22:03.527 "num_base_bdevs_discovered": 2, 00:22:03.527 "num_base_bdevs_operational": 2, 00:22:03.527 "process": { 00:22:03.527 "type": "rebuild", 00:22:03.527 "target": "spare", 00:22:03.527 "progress": { 00:22:03.527 "blocks": 18432, 00:22:03.527 "percent": 29 00:22:03.527 } 00:22:03.527 }, 00:22:03.527 "base_bdevs_list": [ 00:22:03.527 { 00:22:03.527 "name": "spare", 00:22:03.527 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:03.527 "is_configured": true, 00:22:03.527 "data_offset": 2048, 00:22:03.527 "data_size": 63488 00:22:03.527 }, 00:22:03.527 { 00:22:03.527 "name": "BaseBdev2", 00:22:03.527 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:03.527 "is_configured": true, 00:22:03.527 "data_offset": 2048, 00:22:03.527 "data_size": 63488 00:22:03.527 } 00:22:03.527 ] 00:22:03.527 }' 00:22:03.527 07:57:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:03.527 07:57:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:03.527 07:57:48 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:03.527 07:57:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:03.527 07:57:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:04.098 [2024-07-15 07:57:48.598798] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:04.358 [2024-07-15 07:57:49.043809] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:04.358 [2024-07-15 07:57:49.043963] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.618 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.878 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:04.878 "name": "raid_bdev1", 00:22:04.878 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:04.878 "strip_size_kb": 0, 00:22:04.878 "state": "online", 00:22:04.878 "raid_level": "raid1", 00:22:04.878 "superblock": true, 00:22:04.878 "num_base_bdevs": 2, 00:22:04.878 "num_base_bdevs_discovered": 2, 00:22:04.878 "num_base_bdevs_operational": 2, 00:22:04.878 "process": { 00:22:04.878 "type": "rebuild", 00:22:04.878 "target": "spare", 00:22:04.878 "progress": { 00:22:04.878 "blocks": 38912, 00:22:04.878 "percent": 61 00:22:04.878 } 00:22:04.878 }, 00:22:04.878 "base_bdevs_list": [ 00:22:04.878 { 00:22:04.878 "name": "spare", 00:22:04.878 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:04.878 "is_configured": true, 00:22:04.878 "data_offset": 2048, 00:22:04.878 "data_size": 63488 00:22:04.878 }, 00:22:04.878 { 00:22:04.878 "name": "BaseBdev2", 00:22:04.878 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:04.878 "is_configured": true, 00:22:04.878 "data_offset": 2048, 00:22:04.878 "data_size": 63488 00:22:04.878 } 00:22:04.878 ] 00:22:04.878 }' 00:22:04.878 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:04.878 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:04.878 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:04.878 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:04.878 07:57:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:05.463 [2024-07-15 
07:57:50.064378] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:22:05.768 [2024-07-15 07:57:50.281452] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.028 [2024-07-15 07:57:50.603394] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:06.028 [2024-07-15 07:57:50.603645] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:06.028 "name": "raid_bdev1", 00:22:06.028 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:06.028 "strip_size_kb": 0, 00:22:06.028 "state": "online", 00:22:06.028 "raid_level": "raid1", 00:22:06.028 "superblock": true, 00:22:06.028 "num_base_bdevs": 2, 00:22:06.028 "num_base_bdevs_discovered": 2, 00:22:06.028 "num_base_bdevs_operational": 2, 00:22:06.028 "process": { 00:22:06.028 "type": "rebuild", 00:22:06.028 "target": "spare", 00:22:06.028 "progress": { 00:22:06.028 "blocks": 57344, 00:22:06.028 "percent": 90 00:22:06.028 } 00:22:06.028 }, 00:22:06.028 "base_bdevs_list": [ 00:22:06.028 { 00:22:06.028 "name": "spare", 00:22:06.028 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:06.028 "is_configured": true, 00:22:06.028 "data_offset": 2048, 00:22:06.028 "data_size": 63488 00:22:06.028 }, 00:22:06.028 { 00:22:06.028 "name": "BaseBdev2", 00:22:06.028 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:06.028 "is_configured": true, 00:22:06.028 "data_offset": 2048, 00:22:06.028 "data_size": 63488 00:22:06.028 } 00:22:06.028 ] 00:22:06.028 }' 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:06.028 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:06.288 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:06.288 07:57:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:06.549 [2024-07-15 07:57:51.049257] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:06.549 [2024-07-15 
07:57:51.149578] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:06.549 [2024-07-15 07:57:51.157315] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.119 07:57:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:07.379 "name": "raid_bdev1", 00:22:07.379 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:07.379 "strip_size_kb": 0, 00:22:07.379 "state": "online", 00:22:07.379 "raid_level": "raid1", 00:22:07.379 "superblock": true, 00:22:07.379 "num_base_bdevs": 2, 00:22:07.379 "num_base_bdevs_discovered": 2, 00:22:07.379 "num_base_bdevs_operational": 2, 00:22:07.379 "base_bdevs_list": [ 00:22:07.379 { 00:22:07.379 "name": "spare", 00:22:07.379 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:07.379 "is_configured": true, 00:22:07.379 "data_offset": 2048, 00:22:07.379 "data_size": 63488 00:22:07.379 }, 00:22:07.379 { 00:22:07.379 "name": "BaseBdev2", 00:22:07.379 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:07.379 "is_configured": true, 00:22:07.379 "data_offset": 2048, 00:22:07.379 "data_size": 63488 00:22:07.379 } 00:22:07.379 ] 00:22:07.379 }' 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
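One detail worth noting about the progress dumps leading up to the "Finished rebuild" message above: the percent field tracks blocks against the 63488-block size reported for raid_bdev1 earlier in this test, and the 25 / 29 / 61 / 90 sequence seen in this run matches plain integer division (hedged: the exact rounding inside bdev_raid is not visible from this log).

# sanity check of the rebuild percentages reported above against blocks / 63488
for blocks in 16384 18432 38912 57344; do
    awk -v b=$blocks 'BEGIN { printf "%d blocks -> %d%%\n", b, int(b * 100 / 63488) }'
done
# prints: 16384 -> 25%, 18432 -> 29%, 38912 -> 61%, 57344 -> 90%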
00:22:07.379 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.948 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:07.948 "name": "raid_bdev1", 00:22:07.948 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:07.948 "strip_size_kb": 0, 00:22:07.948 "state": "online", 00:22:07.948 "raid_level": "raid1", 00:22:07.948 "superblock": true, 00:22:07.948 "num_base_bdevs": 2, 00:22:07.948 "num_base_bdevs_discovered": 2, 00:22:07.948 "num_base_bdevs_operational": 2, 00:22:07.948 "base_bdevs_list": [ 00:22:07.948 { 00:22:07.948 "name": "spare", 00:22:07.948 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:07.948 "is_configured": true, 00:22:07.948 "data_offset": 2048, 00:22:07.948 "data_size": 63488 00:22:07.948 }, 00:22:07.948 { 00:22:07.948 "name": "BaseBdev2", 00:22:07.948 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:07.948 "is_configured": true, 00:22:07.948 "data_offset": 2048, 00:22:07.948 "data_size": 63488 00:22:07.948 } 00:22:07.948 ] 00:22:07.948 }' 00:22:07.948 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.209 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.469 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.469 "name": "raid_bdev1", 00:22:08.469 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:08.469 "strip_size_kb": 0, 00:22:08.469 "state": "online", 00:22:08.469 "raid_level": "raid1", 00:22:08.469 "superblock": true, 00:22:08.469 "num_base_bdevs": 2, 00:22:08.469 "num_base_bdevs_discovered": 2, 00:22:08.469 "num_base_bdevs_operational": 2, 00:22:08.469 "base_bdevs_list": [ 00:22:08.469 { 00:22:08.469 "name": 
"spare", 00:22:08.469 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:08.469 "is_configured": true, 00:22:08.469 "data_offset": 2048, 00:22:08.469 "data_size": 63488 00:22:08.469 }, 00:22:08.469 { 00:22:08.469 "name": "BaseBdev2", 00:22:08.469 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:08.469 "is_configured": true, 00:22:08.469 "data_offset": 2048, 00:22:08.469 "data_size": 63488 00:22:08.469 } 00:22:08.469 ] 00:22:08.469 }' 00:22:08.469 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.469 07:57:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:09.038 07:57:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:09.038 [2024-07-15 07:57:53.750518] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:09.038 [2024-07-15 07:57:53.750541] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:09.404 00:22:09.404 Latency(us) 00:22:09.404 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:09.404 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:09.404 raid_bdev1 : 11.24 102.08 306.24 0.00 0.00 13497.88 245.76 108083.99 00:22:09.404 =================================================================================================================== 00:22:09.404 Total : 102.08 306.24 0.00 0.00 13497.88 245.76 108083.99 00:22:09.404 [2024-07-15 07:57:53.854036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.404 [2024-07-15 07:57:53.854059] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:09.404 [2024-07-15 07:57:53.854114] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:09.404 [2024-07-15 07:57:53.854121] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26a2390 name raid_bdev1, state offline 00:22:09.404 0 00:22:09.404 07:57:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.404 07:57:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 
00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:09.404 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:09.664 /dev/nbd0 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:09.664 1+0 records in 00:22:09.664 1+0 records out 00:22:09.664 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279256 s, 14.7 MB/s 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:09.664 07:57:54 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:09.664 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:22:09.924 /dev/nbd1 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:09.924 1+0 records in 00:22:09.924 1+0 records out 00:22:09.924 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000199005 s, 20.6 MB/s 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # 
local i 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:09.924 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:10.184 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:10.443 07:57:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:10.443 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:10.703 [2024-07-15 07:57:55.376807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:10.703 [2024-07-15 07:57:55.376839] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.703 [2024-07-15 07:57:55.376851] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251a7f0 00:22:10.703 [2024-07-15 07:57:55.376858] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.703 [2024-07-15 07:57:55.378152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.703 [2024-07-15 07:57:55.378173] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:10.703 [2024-07-15 07:57:55.378228] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:10.703 [2024-07-15 07:57:55.378247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:10.703 [2024-07-15 07:57:55.378323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:10.703 spare 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.703 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.963 [2024-07-15 07:57:55.478613] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25018c0 00:22:10.963 [2024-07-15 07:57:55.478622] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:10.963 [2024-07-15 07:57:55.478771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a0db0 00:22:10.963 [2024-07-15 07:57:55.478882] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25018c0 00:22:10.963 [2024-07-15 07:57:55.478888] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25018c0 00:22:10.963 [2024-07-15 07:57:55.478967] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.963 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.963 "name": "raid_bdev1", 00:22:10.963 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:10.963 "strip_size_kb": 0, 00:22:10.963 "state": "online", 00:22:10.963 "raid_level": "raid1", 00:22:10.963 "superblock": true, 00:22:10.963 "num_base_bdevs": 2, 00:22:10.963 
"num_base_bdevs_discovered": 2, 00:22:10.963 "num_base_bdevs_operational": 2, 00:22:10.963 "base_bdevs_list": [ 00:22:10.963 { 00:22:10.963 "name": "spare", 00:22:10.963 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:10.963 "is_configured": true, 00:22:10.963 "data_offset": 2048, 00:22:10.963 "data_size": 63488 00:22:10.963 }, 00:22:10.963 { 00:22:10.963 "name": "BaseBdev2", 00:22:10.963 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:10.963 "is_configured": true, 00:22:10.963 "data_offset": 2048, 00:22:10.963 "data_size": 63488 00:22:10.963 } 00:22:10.963 ] 00:22:10.963 }' 00:22:10.963 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.963 07:57:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.532 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.792 "name": "raid_bdev1", 00:22:11.792 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:11.792 "strip_size_kb": 0, 00:22:11.792 "state": "online", 00:22:11.792 "raid_level": "raid1", 00:22:11.792 "superblock": true, 00:22:11.792 "num_base_bdevs": 2, 00:22:11.792 "num_base_bdevs_discovered": 2, 00:22:11.792 "num_base_bdevs_operational": 2, 00:22:11.792 "base_bdevs_list": [ 00:22:11.792 { 00:22:11.792 "name": "spare", 00:22:11.792 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:11.792 "is_configured": true, 00:22:11.792 "data_offset": 2048, 00:22:11.792 "data_size": 63488 00:22:11.792 }, 00:22:11.792 { 00:22:11.792 "name": "BaseBdev2", 00:22:11.792 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:11.792 "is_configured": true, 00:22:11.792 "data_offset": 2048, 00:22:11.792 "data_size": 63488 00:22:11.792 } 00:22:11.792 ] 00:22:11.792 }' 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.792 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:12.052 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == 
\s\p\a\r\e ]] 00:22:12.052 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:12.311 [2024-07-15 07:57:56.864824] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.311 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.312 07:57:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.572 07:57:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.572 "name": "raid_bdev1", 00:22:12.572 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:12.572 "strip_size_kb": 0, 00:22:12.572 "state": "online", 00:22:12.572 "raid_level": "raid1", 00:22:12.572 "superblock": true, 00:22:12.572 "num_base_bdevs": 2, 00:22:12.572 "num_base_bdevs_discovered": 1, 00:22:12.572 "num_base_bdevs_operational": 1, 00:22:12.572 "base_bdevs_list": [ 00:22:12.572 { 00:22:12.572 "name": null, 00:22:12.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.572 "is_configured": false, 00:22:12.572 "data_offset": 2048, 00:22:12.572 "data_size": 63488 00:22:12.572 }, 00:22:12.572 { 00:22:12.572 "name": "BaseBdev2", 00:22:12.572 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:12.572 "is_configured": true, 00:22:12.572 "data_offset": 2048, 00:22:12.572 "data_size": 63488 00:22:12.572 } 00:22:12.572 ] 00:22:12.572 }' 00:22:12.572 07:57:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.572 07:57:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:13.143 07:57:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:13.143 [2024-07-15 07:57:57.823369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:13.143 [2024-07-15 07:57:57.823477] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:13.143 [2024-07-15 07:57:57.823487] 
bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:13.143 [2024-07-15 07:57:57.823505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:13.143 [2024-07-15 07:57:57.827040] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ab170 00:22:13.144 [2024-07-15 07:57:57.828579] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:13.144 07:57:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.529 07:57:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.529 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.529 "name": "raid_bdev1", 00:22:14.529 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:14.529 "strip_size_kb": 0, 00:22:14.529 "state": "online", 00:22:14.529 "raid_level": "raid1", 00:22:14.529 "superblock": true, 00:22:14.529 "num_base_bdevs": 2, 00:22:14.529 "num_base_bdevs_discovered": 2, 00:22:14.529 "num_base_bdevs_operational": 2, 00:22:14.529 "process": { 00:22:14.529 "type": "rebuild", 00:22:14.529 "target": "spare", 00:22:14.529 "progress": { 00:22:14.529 "blocks": 22528, 00:22:14.529 "percent": 35 00:22:14.529 } 00:22:14.529 }, 00:22:14.529 "base_bdevs_list": [ 00:22:14.529 { 00:22:14.529 "name": "spare", 00:22:14.529 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:14.529 "is_configured": true, 00:22:14.529 "data_offset": 2048, 00:22:14.529 "data_size": 63488 00:22:14.529 }, 00:22:14.529 { 00:22:14.529 "name": "BaseBdev2", 00:22:14.529 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:14.529 "is_configured": true, 00:22:14.529 "data_offset": 2048, 00:22:14.529 "data_size": 63488 00:22:14.529 } 00:22:14.529 ] 00:22:14.529 }' 00:22:14.529 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.529 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:14.529 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.529 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:14.529 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:14.790 [2024-07-15 07:57:59.329386] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:14.790 [2024-07-15 07:57:59.337358] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on 
raid bdev raid_bdev1: No such device 00:22:14.790 [2024-07-15 07:57:59.337387] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:14.790 [2024-07-15 07:57:59.337396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:14.790 [2024-07-15 07:57:59.337400] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.790 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.050 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.050 "name": "raid_bdev1", 00:22:15.050 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:15.050 "strip_size_kb": 0, 00:22:15.050 "state": "online", 00:22:15.050 "raid_level": "raid1", 00:22:15.050 "superblock": true, 00:22:15.050 "num_base_bdevs": 2, 00:22:15.050 "num_base_bdevs_discovered": 1, 00:22:15.050 "num_base_bdevs_operational": 1, 00:22:15.050 "base_bdevs_list": [ 00:22:15.050 { 00:22:15.050 "name": null, 00:22:15.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.050 "is_configured": false, 00:22:15.050 "data_offset": 2048, 00:22:15.050 "data_size": 63488 00:22:15.050 }, 00:22:15.050 { 00:22:15.050 "name": "BaseBdev2", 00:22:15.050 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:15.050 "is_configured": true, 00:22:15.050 "data_offset": 2048, 00:22:15.050 "data_size": 63488 00:22:15.050 } 00:22:15.050 ] 00:22:15.050 }' 00:22:15.050 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.050 07:57:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:15.990 07:58:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:16.250 [2024-07-15 07:58:01.001805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:16.250 [2024-07-15 07:58:01.001843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:16.250 [2024-07-15 07:58:01.001860] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a16c0 00:22:16.250 [2024-07-15 07:58:01.001867] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:16.250 [2024-07-15 07:58:01.002161] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:16.250 [2024-07-15 07:58:01.002175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:16.250 [2024-07-15 07:58:01.002236] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:16.250 [2024-07-15 07:58:01.002244] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:16.250 [2024-07-15 07:58:01.002251] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:16.250 [2024-07-15 07:58:01.002262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:16.250 [2024-07-15 07:58:01.005770] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2507a00 00:22:16.510 [2024-07-15 07:58:01.006896] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:16.510 spare 00:22:16.510 07:58:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.448 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.019 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:18.019 "name": "raid_bdev1", 00:22:18.019 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:18.019 "strip_size_kb": 0, 00:22:18.019 "state": "online", 00:22:18.019 "raid_level": "raid1", 00:22:18.019 "superblock": true, 00:22:18.019 "num_base_bdevs": 2, 00:22:18.019 "num_base_bdevs_discovered": 2, 00:22:18.019 "num_base_bdevs_operational": 2, 00:22:18.019 "process": { 00:22:18.019 "type": "rebuild", 00:22:18.019 "target": "spare", 00:22:18.019 "progress": { 00:22:18.019 "blocks": 30720, 00:22:18.019 "percent": 48 00:22:18.019 } 00:22:18.019 }, 00:22:18.019 "base_bdevs_list": [ 00:22:18.019 { 00:22:18.019 "name": "spare", 00:22:18.019 "uuid": "f25373e0-c5c5-5e05-8fd3-4e6da824dd12", 00:22:18.019 "is_configured": true, 00:22:18.019 "data_offset": 2048, 00:22:18.019 "data_size": 63488 00:22:18.019 }, 00:22:18.019 { 00:22:18.019 "name": "BaseBdev2", 00:22:18.019 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:18.019 "is_configured": true, 00:22:18.019 "data_offset": 2048, 00:22:18.019 "data_size": 63488 00:22:18.019 } 00:22:18.019 ] 00:22:18.019 }' 00:22:18.019 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:18.019 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:18.019 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:18.019 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:18.019 07:58:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:18.589 [2024-07-15 07:58:03.187880] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:18.589 [2024-07-15 07:58:03.219904] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:18.589 [2024-07-15 07:58:03.219943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:18.589 [2024-07-15 07:58:03.219953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:18.589 [2024-07-15 07:58:03.219958] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.589 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.848 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.848 "name": "raid_bdev1", 00:22:18.848 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:18.848 "strip_size_kb": 0, 00:22:18.848 "state": "online", 00:22:18.848 "raid_level": "raid1", 00:22:18.848 "superblock": true, 00:22:18.848 "num_base_bdevs": 2, 00:22:18.848 "num_base_bdevs_discovered": 1, 00:22:18.848 "num_base_bdevs_operational": 1, 00:22:18.848 "base_bdevs_list": [ 00:22:18.848 { 00:22:18.848 "name": null, 00:22:18.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.848 "is_configured": false, 00:22:18.848 "data_offset": 2048, 00:22:18.848 "data_size": 63488 00:22:18.848 }, 00:22:18.848 { 00:22:18.848 "name": "BaseBdev2", 00:22:18.848 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:18.848 
"is_configured": true, 00:22:18.848 "data_offset": 2048, 00:22:18.848 "data_size": 63488 00:22:18.848 } 00:22:18.848 ] 00:22:18.848 }' 00:22:18.848 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.848 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.417 07:58:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.676 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:19.676 "name": "raid_bdev1", 00:22:19.676 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:19.676 "strip_size_kb": 0, 00:22:19.676 "state": "online", 00:22:19.676 "raid_level": "raid1", 00:22:19.676 "superblock": true, 00:22:19.676 "num_base_bdevs": 2, 00:22:19.676 "num_base_bdevs_discovered": 1, 00:22:19.676 "num_base_bdevs_operational": 1, 00:22:19.676 "base_bdevs_list": [ 00:22:19.676 { 00:22:19.676 "name": null, 00:22:19.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.676 "is_configured": false, 00:22:19.676 "data_offset": 2048, 00:22:19.676 "data_size": 63488 00:22:19.676 }, 00:22:19.676 { 00:22:19.676 "name": "BaseBdev2", 00:22:19.676 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:19.676 "is_configured": true, 00:22:19.676 "data_offset": 2048, 00:22:19.676 "data_size": 63488 00:22:19.676 } 00:22:19.676 ] 00:22:19.676 }' 00:22:19.676 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:19.676 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:19.676 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:19.676 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:19.676 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:19.936 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:19.937 [2024-07-15 07:58:04.627698] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:19.937 [2024-07-15 07:58:04.627731] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.937 [2024-07-15 07:58:04.627743] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2507f60 00:22:19.937 [2024-07-15 07:58:04.627750] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.937 [2024-07-15 07:58:04.628016] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.937 [2024-07-15 07:58:04.628029] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:19.937 [2024-07-15 07:58:04.628072] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:19.937 [2024-07-15 07:58:04.628079] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:19.937 [2024-07-15 07:58:04.628085] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:19.937 BaseBdev1 00:22:19.937 07:58:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.320 "name": "raid_bdev1", 00:22:21.320 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:21.320 "strip_size_kb": 0, 00:22:21.320 "state": "online", 00:22:21.320 "raid_level": "raid1", 00:22:21.320 "superblock": true, 00:22:21.320 "num_base_bdevs": 2, 00:22:21.320 "num_base_bdevs_discovered": 1, 00:22:21.320 "num_base_bdevs_operational": 1, 00:22:21.320 "base_bdevs_list": [ 00:22:21.320 { 00:22:21.320 "name": null, 00:22:21.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.320 "is_configured": false, 00:22:21.320 "data_offset": 2048, 00:22:21.320 "data_size": 63488 00:22:21.320 }, 00:22:21.320 { 00:22:21.320 "name": "BaseBdev2", 00:22:21.320 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:21.320 "is_configured": true, 00:22:21.320 "data_offset": 2048, 00:22:21.320 "data_size": 63488 00:22:21.320 } 00:22:21.320 ] 00:22:21.320 }' 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.320 07:58:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:21.581 07:58:06 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:21.581 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:21.581 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:21.581 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:21.581 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:21.581 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.581 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.840 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:21.840 "name": "raid_bdev1", 00:22:21.840 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:21.840 "strip_size_kb": 0, 00:22:21.840 "state": "online", 00:22:21.840 "raid_level": "raid1", 00:22:21.840 "superblock": true, 00:22:21.840 "num_base_bdevs": 2, 00:22:21.840 "num_base_bdevs_discovered": 1, 00:22:21.840 "num_base_bdevs_operational": 1, 00:22:21.840 "base_bdevs_list": [ 00:22:21.840 { 00:22:21.840 "name": null, 00:22:21.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.840 "is_configured": false, 00:22:21.840 "data_offset": 2048, 00:22:21.840 "data_size": 63488 00:22:21.840 }, 00:22:21.840 { 00:22:21.840 "name": "BaseBdev2", 00:22:21.840 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:21.840 "is_configured": true, 00:22:21.840 "data_offset": 2048, 00:22:21.840 "data_size": 63488 00:22:21.840 } 00:22:21.840 ] 00:22:21.840 }' 00:22:21.840 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:21.840 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:21.840 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # 
type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:22.099 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:22.100 [2024-07-15 07:58:06.797458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:22.100 [2024-07-15 07:58:06.797546] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:22.100 [2024-07-15 07:58:06.797555] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:22.100 request: 00:22:22.100 { 00:22:22.100 "base_bdev": "BaseBdev1", 00:22:22.100 "raid_bdev": "raid_bdev1", 00:22:22.100 "method": "bdev_raid_add_base_bdev", 00:22:22.100 "req_id": 1 00:22:22.100 } 00:22:22.100 Got JSON-RPC error response 00:22:22.100 response: 00:22:22.100 { 00:22:22.100 "code": -22, 00:22:22.100 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:22.100 } 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:22.100 07:58:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.481 07:58:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:22:23.481 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.481 "name": "raid_bdev1", 00:22:23.481 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:23.481 "strip_size_kb": 0, 00:22:23.481 "state": "online", 00:22:23.481 "raid_level": "raid1", 00:22:23.481 "superblock": true, 00:22:23.481 "num_base_bdevs": 2, 00:22:23.481 "num_base_bdevs_discovered": 1, 00:22:23.481 "num_base_bdevs_operational": 1, 00:22:23.481 "base_bdevs_list": [ 00:22:23.481 { 00:22:23.481 "name": null, 00:22:23.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.481 "is_configured": false, 00:22:23.481 "data_offset": 2048, 00:22:23.481 "data_size": 63488 00:22:23.481 }, 00:22:23.481 { 00:22:23.481 "name": "BaseBdev2", 00:22:23.481 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:23.481 "is_configured": true, 00:22:23.481 "data_offset": 2048, 00:22:23.481 "data_size": 63488 00:22:23.481 } 00:22:23.481 ] 00:22:23.481 }' 00:22:23.481 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.481 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:24.052 "name": "raid_bdev1", 00:22:24.052 "uuid": "6ec10730-1db2-4500-a0d9-5d60aaa339f0", 00:22:24.052 "strip_size_kb": 0, 00:22:24.052 "state": "online", 00:22:24.052 "raid_level": "raid1", 00:22:24.052 "superblock": true, 00:22:24.052 "num_base_bdevs": 2, 00:22:24.052 "num_base_bdevs_discovered": 1, 00:22:24.052 "num_base_bdevs_operational": 1, 00:22:24.052 "base_bdevs_list": [ 00:22:24.052 { 00:22:24.052 "name": null, 00:22:24.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.052 "is_configured": false, 00:22:24.052 "data_offset": 2048, 00:22:24.052 "data_size": 63488 00:22:24.052 }, 00:22:24.052 { 00:22:24.052 "name": "BaseBdev2", 00:22:24.052 "uuid": "56284980-36d0-5990-8def-4edd55eea4c8", 00:22:24.052 "is_configured": true, 00:22:24.052 "data_offset": 2048, 00:22:24.052 "data_size": 63488 00:22:24.052 } 00:22:24.052 ] 00:22:24.052 }' 00:22:24.052 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:24.313 
07:58:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1716636 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1716636 ']' 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1716636 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1716636 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1716636' 00:22:24.313 killing process with pid 1716636 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1716636 00:22:24.313 Received shutdown signal, test time was about 26.261643 seconds 00:22:24.313 00:22:24.313 Latency(us) 00:22:24.313 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:24.313 =================================================================================================================== 00:22:24.313 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:22:24.313 [2024-07-15 07:58:08.911446] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:24.313 [2024-07-15 07:58:08.911513] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:24.313 [2024-07-15 07:58:08.911544] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:24.313 [2024-07-15 07:58:08.911550] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25018c0 name raid_bdev1, state offline 00:22:24.313 07:58:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1716636 00:22:24.313 [2024-07-15 07:58:08.923343] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:24.313 07:58:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:22:24.313 00:22:24.313 real 0m30.024s 00:22:24.313 user 0m47.731s 00:22:24.313 sys 0m3.218s 00:22:24.313 07:58:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:24.313 07:58:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:24.313 ************************************ 00:22:24.313 END TEST raid_rebuild_test_sb_io 00:22:24.313 ************************************ 00:22:24.573 07:58:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:24.574 07:58:09 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:24.574 07:58:09 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:22:24.574 07:58:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:24.574 07:58:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:24.574 07:58:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:24.574 ************************************ 00:22:24.574 START TEST raid_rebuild_test 00:22:24.574 ************************************ 00:22:24.574 07:58:09 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1722236 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1722236 /var/tmp/spdk-raid.sock 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1722236 ']' 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 
60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:24.574 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:24.574 07:58:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.574 [2024-07-15 07:58:09.192560] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:22:24.574 [2024-07-15 07:58:09.192606] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1722236 ] 00:22:24.574 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:24.574 Zero copy mechanism will not be used. 00:22:24.574 [2024-07-15 07:58:09.279094] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:24.834 [2024-07-15 07:58:09.342027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:24.834 [2024-07-15 07:58:09.384811] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:24.834 [2024-07-15 07:58:09.384833] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:25.403 07:58:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:25.403 07:58:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:25.403 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:25.403 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:25.662 BaseBdev1_malloc 00:22:25.662 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:25.662 [2024-07-15 07:58:10.398443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:25.662 [2024-07-15 07:58:10.398477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:25.662 [2024-07-15 07:58:10.398490] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17e1d30 00:22:25.662 [2024-07-15 07:58:10.398500] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.662 [2024-07-15 07:58:10.399789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.662 [2024-07-15 07:58:10.399810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:25.662 BaseBdev1 00:22:25.662 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:25.662 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:25.921 BaseBdev2_malloc 00:22:25.921 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:26.180 [2024-07-15 07:58:10.773495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:26.180 [2024-07-15 07:58:10.773522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.180 [2024-07-15 07:58:10.773532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1994c60 00:22:26.180 [2024-07-15 07:58:10.773539] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.180 [2024-07-15 07:58:10.774745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.180 [2024-07-15 07:58:10.774764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:26.180 BaseBdev2 00:22:26.180 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:26.180 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:26.440 BaseBdev3_malloc 00:22:26.440 07:58:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:26.440 [2024-07-15 07:58:11.156403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:26.440 [2024-07-15 07:58:11.156432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.440 [2024-07-15 07:58:11.156443] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1979b90 00:22:26.440 [2024-07-15 07:58:11.156450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.440 [2024-07-15 07:58:11.157632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.440 [2024-07-15 07:58:11.157651] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:26.440 BaseBdev3 00:22:26.440 07:58:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:26.440 07:58:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:26.699 BaseBdev4_malloc 00:22:26.699 07:58:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:26.959 [2024-07-15 07:58:11.559198] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:26.959 [2024-07-15 07:58:11.559225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.959 [2024-07-15 07:58:11.559236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17e28c0 00:22:26.959 [2024-07-15 07:58:11.559242] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.959 [2024-07-15 07:58:11.560414] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.960 [2024-07-15 07:58:11.560433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:26.960 BaseBdev4 00:22:26.960 07:58:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:27.219 spare_malloc 00:22:27.219 07:58:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:27.219 spare_delay 00:22:27.219 07:58:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:27.480 [2024-07-15 07:58:12.134426] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:27.480 [2024-07-15 07:58:12.134455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.480 [2024-07-15 07:58:12.134466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17daa80 00:22:27.480 [2024-07-15 07:58:12.134472] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.480 [2024-07-15 07:58:12.135678] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.480 [2024-07-15 07:58:12.135699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:27.480 spare 00:22:27.480 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:27.741 [2024-07-15 07:58:12.322919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:27.741 [2024-07-15 07:58:12.323923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:27.741 [2024-07-15 07:58:12.323963] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:27.741 [2024-07-15 07:58:12.323997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:27.741 [2024-07-15 07:58:12.324055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17dbb30 00:22:27.741 [2024-07-15 07:58:12.324060] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:27.741 [2024-07-15 07:58:12.324217] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17dfbf0 00:22:27.741 [2024-07-15 07:58:12.324333] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17dbb30 00:22:27.741 [2024-07-15 07:58:12.324339] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17dbb30 00:22:27.741 [2024-07-15 07:58:12.324419] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
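The sequence above assembles the array under test entirely through the RPC interface: each base bdev is a malloc bdev exposed under a BaseBdevN name via a passthru vbdev, a delay bdev wrapped as "spare" serves as the eventual rebuild target, and the four base bdevs are combined into a RAID1 volume named raid_bdev1. A condensed sketch of that setup, using only the rpc.py calls logged above (the RPC shell variable and the loop are illustrative shorthand; the test issues each call individually):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3 4; do
        # 32 MiB backing store with 512-byte blocks (65536 blocks each)
        $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        # expose the malloc bdev under the BaseBdevN name via a passthru vbdev
        $RPC bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done
    # delayed bdev wrapped as "spare", used later as the rebuild target
    $RPC bdev_malloc_create 32 512 -b spare_malloc
    $RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $RPC bdev_passthru_create -b spare_delay -p spare
    # assemble the RAID1 volume from the four passthru bdevs
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

The passthru and delay layers matter for the test: they give the raid module plain named bdevs it can claim and release, and the delay on the spare keeps the rebuild slow enough to observe its progress in the verification steps that follow.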
00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.741 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.002 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.002 "name": "raid_bdev1", 00:22:28.002 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:28.002 "strip_size_kb": 0, 00:22:28.002 "state": "online", 00:22:28.002 "raid_level": "raid1", 00:22:28.002 "superblock": false, 00:22:28.002 "num_base_bdevs": 4, 00:22:28.002 "num_base_bdevs_discovered": 4, 00:22:28.002 "num_base_bdevs_operational": 4, 00:22:28.002 "base_bdevs_list": [ 00:22:28.002 { 00:22:28.002 "name": "BaseBdev1", 00:22:28.002 "uuid": "66af0f8b-ea31-55b2-926b-e848fd3cb576", 00:22:28.002 "is_configured": true, 00:22:28.002 "data_offset": 0, 00:22:28.002 "data_size": 65536 00:22:28.002 }, 00:22:28.002 { 00:22:28.002 "name": "BaseBdev2", 00:22:28.002 "uuid": "c688ad74-3d99-5427-beb2-ea5f2b6520da", 00:22:28.002 "is_configured": true, 00:22:28.002 "data_offset": 0, 00:22:28.002 "data_size": 65536 00:22:28.002 }, 00:22:28.002 { 00:22:28.002 "name": "BaseBdev3", 00:22:28.002 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:28.002 "is_configured": true, 00:22:28.002 "data_offset": 0, 00:22:28.002 "data_size": 65536 00:22:28.002 }, 00:22:28.002 { 00:22:28.002 "name": "BaseBdev4", 00:22:28.002 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:28.002 "is_configured": true, 00:22:28.002 "data_offset": 0, 00:22:28.002 "data_size": 65536 00:22:28.002 } 00:22:28.002 ] 00:22:28.002 }' 00:22:28.002 07:58:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.002 07:58:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:28.572 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:28.572 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:28.572 [2024-07-15 07:58:13.245476] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:28.572 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:28.572 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.572 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:28.831 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:29.090 [2024-07-15 07:58:13.630229] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17dcb00 00:22:29.090 /dev/nbd0 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.090 1+0 records in 00:22:29.090 1+0 records out 00:22:29.090 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259631 s, 15.8 MB/s 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:29.090 07:58:13 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:29.090 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.091 07:58:13 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:29.091 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:29.091 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:29.091 07:58:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:41.310 65536+0 records in 00:22:41.310 65536+0 records out 00:22:41.310 33554432 bytes (34 MB, 32 MiB) copied, 10.3662 s, 3.2 MB/s 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:41.310 [2024-07-15 07:58:24.245352] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:41.310 [2024-07-15 07:58:24.427514] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
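At this point the test has seeded the array with data and degraded it: raid_bdev1 is exported as /dev/nbd0, 65536 512-byte blocks (32 MiB) of random data are written with dd, the NBD device is torn down, and BaseBdev1 is pulled so the RAID1 volume drops to 3 of 4 base bdevs while remaining online. A sketch of that step, condensed from the calls shown above (the RPC shell variable is shorthand; paths and dd parameters are the ones from this run):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC nbd_start_disk raid_bdev1 /dev/nbd0
    # fill the whole 32 MiB volume with random data
    dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
    $RPC nbd_stop_disk /dev/nbd0
    # drop one member; the raid1 bdev stays online with 3 of 4 base bdevs
    $RPC bdev_raid_remove_base_bdev BaseBdev1

The verification that follows confirms exactly that degraded-but-online state (num_base_bdevs_discovered and num_base_bdevs_operational both 3, the removed slot listed with a null name and all-zero uuid) before the spare is added to start the rebuild.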
00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.310 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.310 "name": "raid_bdev1", 00:22:41.310 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:41.310 "strip_size_kb": 0, 00:22:41.310 "state": "online", 00:22:41.310 "raid_level": "raid1", 00:22:41.310 "superblock": false, 00:22:41.310 "num_base_bdevs": 4, 00:22:41.310 "num_base_bdevs_discovered": 3, 00:22:41.310 "num_base_bdevs_operational": 3, 00:22:41.310 "base_bdevs_list": [ 00:22:41.310 { 00:22:41.310 "name": null, 00:22:41.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.310 "is_configured": false, 00:22:41.310 "data_offset": 0, 00:22:41.310 "data_size": 65536 00:22:41.310 }, 00:22:41.310 { 00:22:41.310 "name": "BaseBdev2", 00:22:41.310 "uuid": "c688ad74-3d99-5427-beb2-ea5f2b6520da", 00:22:41.310 "is_configured": true, 00:22:41.310 "data_offset": 0, 00:22:41.310 "data_size": 65536 00:22:41.310 }, 00:22:41.310 { 00:22:41.310 "name": "BaseBdev3", 00:22:41.310 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:41.310 "is_configured": true, 00:22:41.310 "data_offset": 0, 00:22:41.310 "data_size": 65536 00:22:41.310 }, 00:22:41.311 { 00:22:41.311 "name": "BaseBdev4", 00:22:41.311 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:41.311 "is_configured": true, 00:22:41.311 "data_offset": 0, 00:22:41.311 "data_size": 65536 00:22:41.311 } 00:22:41.311 ] 00:22:41.311 }' 00:22:41.311 07:58:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.311 07:58:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:41.311 07:58:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:41.311 [2024-07-15 07:58:25.373919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:41.311 [2024-07-15 07:58:25.376684] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x197b8b0 00:22:41.311 [2024-07-15 07:58:25.378279] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:41.311 07:58:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:41.881 "name": "raid_bdev1", 00:22:41.881 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:41.881 "strip_size_kb": 0, 00:22:41.881 "state": "online", 00:22:41.881 "raid_level": "raid1", 00:22:41.881 "superblock": false, 00:22:41.881 "num_base_bdevs": 4, 00:22:41.881 "num_base_bdevs_discovered": 4, 00:22:41.881 "num_base_bdevs_operational": 4, 00:22:41.881 "process": { 00:22:41.881 "type": "rebuild", 00:22:41.881 "target": "spare", 00:22:41.881 "progress": { 00:22:41.881 "blocks": 22528, 00:22:41.881 "percent": 34 00:22:41.881 } 00:22:41.881 }, 00:22:41.881 "base_bdevs_list": [ 00:22:41.881 { 00:22:41.881 "name": "spare", 00:22:41.881 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:41.881 "is_configured": true, 00:22:41.881 "data_offset": 0, 00:22:41.881 "data_size": 65536 00:22:41.881 }, 00:22:41.881 { 00:22:41.881 "name": "BaseBdev2", 00:22:41.881 "uuid": "c688ad74-3d99-5427-beb2-ea5f2b6520da", 00:22:41.881 "is_configured": true, 00:22:41.881 "data_offset": 0, 00:22:41.881 "data_size": 65536 00:22:41.881 }, 00:22:41.881 { 00:22:41.881 "name": "BaseBdev3", 00:22:41.881 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:41.881 "is_configured": true, 00:22:41.881 "data_offset": 0, 00:22:41.881 "data_size": 65536 00:22:41.881 }, 00:22:41.881 { 00:22:41.881 "name": "BaseBdev4", 00:22:41.881 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:41.881 "is_configured": true, 00:22:41.881 "data_offset": 0, 00:22:41.881 "data_size": 65536 00:22:41.881 } 00:22:41.881 ] 00:22:41.881 }' 00:22:41.881 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:42.142 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:42.142 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:42.142 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:42.142 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:42.142 [2024-07-15 07:58:26.867038] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:42.142 [2024-07-15 07:58:26.887082] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:42.142 [2024-07-15 07:58:26.887114] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.142 [2024-07-15 07:58:26.887125] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:42.142 [2024-07-15 07:58:26.887130] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.402 07:58:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.402 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.402 "name": "raid_bdev1", 00:22:42.402 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:42.402 "strip_size_kb": 0, 00:22:42.402 "state": "online", 00:22:42.402 "raid_level": "raid1", 00:22:42.402 "superblock": false, 00:22:42.402 "num_base_bdevs": 4, 00:22:42.402 "num_base_bdevs_discovered": 3, 00:22:42.402 "num_base_bdevs_operational": 3, 00:22:42.402 "base_bdevs_list": [ 00:22:42.402 { 00:22:42.402 "name": null, 00:22:42.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.402 "is_configured": false, 00:22:42.402 "data_offset": 0, 00:22:42.402 "data_size": 65536 00:22:42.402 }, 00:22:42.402 { 00:22:42.402 "name": "BaseBdev2", 00:22:42.402 "uuid": "c688ad74-3d99-5427-beb2-ea5f2b6520da", 00:22:42.402 "is_configured": true, 00:22:42.402 "data_offset": 0, 00:22:42.402 "data_size": 65536 00:22:42.402 }, 00:22:42.402 { 00:22:42.402 "name": "BaseBdev3", 00:22:42.402 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:42.402 "is_configured": true, 00:22:42.402 "data_offset": 0, 00:22:42.402 "data_size": 65536 00:22:42.402 }, 00:22:42.402 { 00:22:42.402 "name": "BaseBdev4", 00:22:42.402 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:42.402 "is_configured": true, 00:22:42.402 "data_offset": 0, 00:22:42.402 "data_size": 65536 00:22:42.402 } 00:22:42.402 ] 00:22:42.402 }' 00:22:42.402 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.402 07:58:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.973 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.233 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:22:43.233 "name": "raid_bdev1", 00:22:43.233 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:43.233 "strip_size_kb": 0, 00:22:43.233 "state": "online", 00:22:43.233 "raid_level": "raid1", 00:22:43.233 "superblock": false, 00:22:43.233 "num_base_bdevs": 4, 00:22:43.233 "num_base_bdevs_discovered": 3, 00:22:43.233 "num_base_bdevs_operational": 3, 00:22:43.233 "base_bdevs_list": [ 00:22:43.233 { 00:22:43.233 "name": null, 00:22:43.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.233 "is_configured": false, 00:22:43.233 "data_offset": 0, 00:22:43.233 "data_size": 65536 00:22:43.233 }, 00:22:43.233 { 00:22:43.233 "name": "BaseBdev2", 00:22:43.233 "uuid": "c688ad74-3d99-5427-beb2-ea5f2b6520da", 00:22:43.233 "is_configured": true, 00:22:43.233 "data_offset": 0, 00:22:43.233 "data_size": 65536 00:22:43.233 }, 00:22:43.233 { 00:22:43.233 "name": "BaseBdev3", 00:22:43.233 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:43.233 "is_configured": true, 00:22:43.233 "data_offset": 0, 00:22:43.233 "data_size": 65536 00:22:43.233 }, 00:22:43.233 { 00:22:43.233 "name": "BaseBdev4", 00:22:43.233 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:43.233 "is_configured": true, 00:22:43.233 "data_offset": 0, 00:22:43.233 "data_size": 65536 00:22:43.233 } 00:22:43.233 ] 00:22:43.233 }' 00:22:43.233 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.233 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:43.233 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.233 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:43.233 07:58:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:43.494 [2024-07-15 07:58:28.090130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:43.494 [2024-07-15 07:58:28.092943] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x197b880 00:22:43.494 [2024-07-15 07:58:28.094100] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:43.494 07:58:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.435 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:44.696 "name": "raid_bdev1", 00:22:44.696 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:44.696 "strip_size_kb": 0, 00:22:44.696 "state": "online", 
00:22:44.696 "raid_level": "raid1", 00:22:44.696 "superblock": false, 00:22:44.696 "num_base_bdevs": 4, 00:22:44.696 "num_base_bdevs_discovered": 4, 00:22:44.696 "num_base_bdevs_operational": 4, 00:22:44.696 "process": { 00:22:44.696 "type": "rebuild", 00:22:44.696 "target": "spare", 00:22:44.696 "progress": { 00:22:44.696 "blocks": 22528, 00:22:44.696 "percent": 34 00:22:44.696 } 00:22:44.696 }, 00:22:44.696 "base_bdevs_list": [ 00:22:44.696 { 00:22:44.696 "name": "spare", 00:22:44.696 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:44.696 "is_configured": true, 00:22:44.696 "data_offset": 0, 00:22:44.696 "data_size": 65536 00:22:44.696 }, 00:22:44.696 { 00:22:44.696 "name": "BaseBdev2", 00:22:44.696 "uuid": "c688ad74-3d99-5427-beb2-ea5f2b6520da", 00:22:44.696 "is_configured": true, 00:22:44.696 "data_offset": 0, 00:22:44.696 "data_size": 65536 00:22:44.696 }, 00:22:44.696 { 00:22:44.696 "name": "BaseBdev3", 00:22:44.696 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:44.696 "is_configured": true, 00:22:44.696 "data_offset": 0, 00:22:44.696 "data_size": 65536 00:22:44.696 }, 00:22:44.696 { 00:22:44.696 "name": "BaseBdev4", 00:22:44.696 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:44.696 "is_configured": true, 00:22:44.696 "data_offset": 0, 00:22:44.696 "data_size": 65536 00:22:44.696 } 00:22:44.696 ] 00:22:44.696 }' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:44.696 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:44.967 [2024-07-15 07:58:29.570586] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:44.967 [2024-07-15 07:58:29.602938] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x197b880 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.967 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:45.255 "name": "raid_bdev1", 00:22:45.255 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:45.255 "strip_size_kb": 0, 00:22:45.255 "state": "online", 00:22:45.255 "raid_level": "raid1", 00:22:45.255 "superblock": false, 00:22:45.255 "num_base_bdevs": 4, 00:22:45.255 "num_base_bdevs_discovered": 3, 00:22:45.255 "num_base_bdevs_operational": 3, 00:22:45.255 "process": { 00:22:45.255 "type": "rebuild", 00:22:45.255 "target": "spare", 00:22:45.255 "progress": { 00:22:45.255 "blocks": 34816, 00:22:45.255 "percent": 53 00:22:45.255 } 00:22:45.255 }, 00:22:45.255 "base_bdevs_list": [ 00:22:45.255 { 00:22:45.255 "name": "spare", 00:22:45.255 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:45.255 "is_configured": true, 00:22:45.255 "data_offset": 0, 00:22:45.255 "data_size": 65536 00:22:45.255 }, 00:22:45.255 { 00:22:45.255 "name": null, 00:22:45.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.255 "is_configured": false, 00:22:45.255 "data_offset": 0, 00:22:45.255 "data_size": 65536 00:22:45.255 }, 00:22:45.255 { 00:22:45.255 "name": "BaseBdev3", 00:22:45.255 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:45.255 "is_configured": true, 00:22:45.255 "data_offset": 0, 00:22:45.255 "data_size": 65536 00:22:45.255 }, 00:22:45.255 { 00:22:45.255 "name": "BaseBdev4", 00:22:45.255 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:45.255 "is_configured": true, 00:22:45.255 "data_offset": 0, 00:22:45.255 "data_size": 65536 00:22:45.255 } 00:22:45.255 ] 00:22:45.255 }' 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=775 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.255 07:58:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.518 07:58:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:45.518 "name": "raid_bdev1", 00:22:45.518 "uuid": 
"30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:45.518 "strip_size_kb": 0, 00:22:45.518 "state": "online", 00:22:45.518 "raid_level": "raid1", 00:22:45.518 "superblock": false, 00:22:45.518 "num_base_bdevs": 4, 00:22:45.518 "num_base_bdevs_discovered": 3, 00:22:45.518 "num_base_bdevs_operational": 3, 00:22:45.518 "process": { 00:22:45.518 "type": "rebuild", 00:22:45.518 "target": "spare", 00:22:45.518 "progress": { 00:22:45.518 "blocks": 38912, 00:22:45.518 "percent": 59 00:22:45.518 } 00:22:45.518 }, 00:22:45.518 "base_bdevs_list": [ 00:22:45.518 { 00:22:45.518 "name": "spare", 00:22:45.518 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:45.518 "is_configured": true, 00:22:45.518 "data_offset": 0, 00:22:45.518 "data_size": 65536 00:22:45.518 }, 00:22:45.518 { 00:22:45.518 "name": null, 00:22:45.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.518 "is_configured": false, 00:22:45.518 "data_offset": 0, 00:22:45.518 "data_size": 65536 00:22:45.518 }, 00:22:45.518 { 00:22:45.518 "name": "BaseBdev3", 00:22:45.518 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:45.518 "is_configured": true, 00:22:45.518 "data_offset": 0, 00:22:45.518 "data_size": 65536 00:22:45.518 }, 00:22:45.518 { 00:22:45.518 "name": "BaseBdev4", 00:22:45.518 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:45.518 "is_configured": true, 00:22:45.518 "data_offset": 0, 00:22:45.518 "data_size": 65536 00:22:45.518 } 00:22:45.518 ] 00:22:45.518 }' 00:22:45.518 07:58:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.518 07:58:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:45.518 07:58:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.518 07:58:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:45.518 07:58:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:46.457 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:46.457 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.457 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.457 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.457 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.457 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.717 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.717 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.717 [2024-07-15 07:58:31.312880] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:46.717 [2024-07-15 07:58:31.312923] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:46.717 [2024-07-15 07:58:31.312951] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:46.717 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.717 "name": "raid_bdev1", 00:22:46.717 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:46.717 
"strip_size_kb": 0, 00:22:46.717 "state": "online", 00:22:46.717 "raid_level": "raid1", 00:22:46.717 "superblock": false, 00:22:46.717 "num_base_bdevs": 4, 00:22:46.717 "num_base_bdevs_discovered": 3, 00:22:46.717 "num_base_bdevs_operational": 3, 00:22:46.717 "base_bdevs_list": [ 00:22:46.717 { 00:22:46.717 "name": "spare", 00:22:46.717 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:46.717 "is_configured": true, 00:22:46.717 "data_offset": 0, 00:22:46.717 "data_size": 65536 00:22:46.717 }, 00:22:46.717 { 00:22:46.717 "name": null, 00:22:46.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.717 "is_configured": false, 00:22:46.717 "data_offset": 0, 00:22:46.717 "data_size": 65536 00:22:46.717 }, 00:22:46.717 { 00:22:46.717 "name": "BaseBdev3", 00:22:46.717 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:46.717 "is_configured": true, 00:22:46.717 "data_offset": 0, 00:22:46.717 "data_size": 65536 00:22:46.717 }, 00:22:46.717 { 00:22:46.717 "name": "BaseBdev4", 00:22:46.717 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:46.717 "is_configured": true, 00:22:46.717 "data_offset": 0, 00:22:46.717 "data_size": 65536 00:22:46.717 } 00:22:46.717 ] 00:22:46.717 }' 00:22:46.717 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.717 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:46.717 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.976 "name": "raid_bdev1", 00:22:46.976 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:46.976 "strip_size_kb": 0, 00:22:46.976 "state": "online", 00:22:46.976 "raid_level": "raid1", 00:22:46.976 "superblock": false, 00:22:46.976 "num_base_bdevs": 4, 00:22:46.976 "num_base_bdevs_discovered": 3, 00:22:46.976 "num_base_bdevs_operational": 3, 00:22:46.976 "base_bdevs_list": [ 00:22:46.976 { 00:22:46.976 "name": "spare", 00:22:46.976 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:46.976 "is_configured": true, 00:22:46.976 "data_offset": 0, 00:22:46.976 "data_size": 65536 00:22:46.976 }, 00:22:46.976 { 00:22:46.976 "name": null, 00:22:46.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.976 "is_configured": false, 00:22:46.976 "data_offset": 0, 00:22:46.976 "data_size": 65536 00:22:46.976 }, 00:22:46.976 { 00:22:46.976 "name": 
"BaseBdev3", 00:22:46.976 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:46.976 "is_configured": true, 00:22:46.976 "data_offset": 0, 00:22:46.976 "data_size": 65536 00:22:46.976 }, 00:22:46.976 { 00:22:46.976 "name": "BaseBdev4", 00:22:46.976 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:46.976 "is_configured": true, 00:22:46.976 "data_offset": 0, 00:22:46.976 "data_size": 65536 00:22:46.976 } 00:22:46.976 ] 00:22:46.976 }' 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.976 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.235 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:47.235 "name": "raid_bdev1", 00:22:47.235 "uuid": "30a7caa1-076d-4291-b14c-f9961fe3f4a4", 00:22:47.235 "strip_size_kb": 0, 00:22:47.235 "state": "online", 00:22:47.235 "raid_level": "raid1", 00:22:47.235 "superblock": false, 00:22:47.235 "num_base_bdevs": 4, 00:22:47.235 "num_base_bdevs_discovered": 3, 00:22:47.235 "num_base_bdevs_operational": 3, 00:22:47.235 "base_bdevs_list": [ 00:22:47.235 { 00:22:47.236 "name": "spare", 00:22:47.236 "uuid": "4cf03b54-633b-5c20-b61b-3f263157fd37", 00:22:47.236 "is_configured": true, 00:22:47.236 "data_offset": 0, 00:22:47.236 "data_size": 65536 00:22:47.236 }, 00:22:47.236 { 00:22:47.236 "name": null, 00:22:47.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.236 "is_configured": false, 00:22:47.236 "data_offset": 0, 00:22:47.236 "data_size": 65536 00:22:47.236 }, 00:22:47.236 { 00:22:47.236 "name": "BaseBdev3", 00:22:47.236 "uuid": "c542808b-8b88-537a-9f4d-1e98c17d47e0", 00:22:47.236 "is_configured": true, 00:22:47.236 "data_offset": 0, 00:22:47.236 "data_size": 65536 00:22:47.236 }, 00:22:47.236 { 00:22:47.236 "name": "BaseBdev4", 00:22:47.236 "uuid": "469a766a-705d-5cbb-9267-9ea8b107708c", 00:22:47.236 
"is_configured": true, 00:22:47.236 "data_offset": 0, 00:22:47.236 "data_size": 65536 00:22:47.236 } 00:22:47.236 ] 00:22:47.236 }' 00:22:47.236 07:58:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:47.236 07:58:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:47.803 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:48.061 [2024-07-15 07:58:32.687939] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:48.061 [2024-07-15 07:58:32.687957] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:48.061 [2024-07-15 07:58:32.687998] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:48.061 [2024-07-15 07:58:32.688053] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:48.062 [2024-07-15 07:58:32.688060] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17dbb30 name raid_bdev1, state offline 00:22:48.062 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.062 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:48.321 07:58:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:48.581 /dev/nbd0 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( 
i <= 20 )) 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:48.581 1+0 records in 00:22:48.581 1+0 records out 00:22:48.581 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278551 s, 14.7 MB/s 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:48.581 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:48.581 /dev/nbd1 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:48.840 1+0 records in 00:22:48.840 1+0 records out 00:22:48.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231598 s, 17.7 MB/s 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:48.840 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1722236 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 
-- # '[' -z 1722236 ']' 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1722236 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:49.098 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1722236 00:22:49.357 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:49.357 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:49.357 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1722236' 00:22:49.357 killing process with pid 1722236 00:22:49.357 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1722236 00:22:49.357 Received shutdown signal, test time was about 60.000000 seconds 00:22:49.357 00:22:49.357 Latency(us) 00:22:49.357 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:49.357 =================================================================================================================== 00:22:49.357 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:49.357 [2024-07-15 07:58:33.877566] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:49.357 07:58:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1722236 00:22:49.357 [2024-07-15 07:58:33.903625] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:22:49.357 00:22:49.357 real 0m24.896s 00:22:49.357 user 0m31.865s 00:22:49.357 sys 0m4.590s 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:49.357 ************************************ 00:22:49.357 END TEST raid_rebuild_test 00:22:49.357 ************************************ 00:22:49.357 07:58:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:49.357 07:58:34 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:22:49.357 07:58:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:49.357 07:58:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:49.357 07:58:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:49.357 ************************************ 00:22:49.357 START TEST raid_rebuild_test_sb 00:22:49.357 ************************************ 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 
00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:49.357 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1726472 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1726472 /var/tmp/spdk-raid.sock 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1726472 ']' 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:49.358 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:22:49.358 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:49.618 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:49.618 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:49.618 [2024-07-15 07:58:34.165847] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:22:49.618 [2024-07-15 07:58:34.165913] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1726472 ] 00:22:49.618 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:49.618 Zero copy mechanism will not be used. 00:22:49.618 [2024-07-15 07:58:34.246650] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:49.618 [2024-07-15 07:58:34.313353] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:49.618 [2024-07-15 07:58:34.351512] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:49.618 [2024-07-15 07:58:34.351536] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:50.558 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:50.558 07:58:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:50.558 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:50.558 07:58:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:50.558 BaseBdev1_malloc 00:22:50.558 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:50.558 [2024-07-15 07:58:35.257174] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:50.558 [2024-07-15 07:58:35.257210] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:50.558 [2024-07-15 07:58:35.257222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dbd30 00:22:50.558 [2024-07-15 07:58:35.257228] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:50.558 [2024-07-15 07:58:35.258480] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:50.558 [2024-07-15 07:58:35.258499] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:50.558 BaseBdev1 00:22:50.558 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:50.558 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:50.819 BaseBdev2_malloc 00:22:50.819 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:50.819 [2024-07-15 07:58:35.559637] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
BaseBdev2_malloc 00:22:50.819 [2024-07-15 07:58:35.559666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:50.819 [2024-07-15 07:58:35.559676] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x188ec60 00:22:50.819 [2024-07-15 07:58:35.559682] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:50.819 [2024-07-15 07:58:35.560832] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:50.819 [2024-07-15 07:58:35.560850] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:50.819 BaseBdev2 00:22:51.079 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:51.079 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:51.079 BaseBdev3_malloc 00:22:51.079 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:51.340 [2024-07-15 07:58:35.906054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:51.340 [2024-07-15 07:58:35.906079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:51.340 [2024-07-15 07:58:35.906088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1873b90 00:22:51.340 [2024-07-15 07:58:35.906095] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:51.340 [2024-07-15 07:58:35.907219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:51.340 [2024-07-15 07:58:35.907236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:51.340 BaseBdev3 00:22:51.340 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:51.340 07:58:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:51.601 BaseBdev4_malloc 00:22:51.601 07:58:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:51.601 [2024-07-15 07:58:36.276582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:51.601 [2024-07-15 07:58:36.276607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:51.601 [2024-07-15 07:58:36.276617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dc8c0 00:22:51.601 [2024-07-15 07:58:36.276623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:51.601 [2024-07-15 07:58:36.277749] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:51.601 [2024-07-15 07:58:36.277767] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:51.601 BaseBdev4 00:22:51.601 07:58:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
spare_malloc 00:22:51.861 spare_malloc 00:22:51.861 07:58:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:52.121 spare_delay 00:22:52.121 07:58:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:52.121 [2024-07-15 07:58:36.815525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:52.121 [2024-07-15 07:58:36.815549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.121 [2024-07-15 07:58:36.815558] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d4a80 00:22:52.121 [2024-07-15 07:58:36.815565] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.121 [2024-07-15 07:58:36.816718] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.121 [2024-07-15 07:58:36.816736] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:52.121 spare 00:22:52.121 07:58:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:52.692 [2024-07-15 07:58:37.344877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:52.692 [2024-07-15 07:58:37.345862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:52.692 [2024-07-15 07:58:37.345904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:52.692 [2024-07-15 07:58:37.345936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:52.692 [2024-07-15 07:58:37.346075] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16d5b30 00:22:52.692 [2024-07-15 07:58:37.346082] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:52.692 [2024-07-15 07:58:37.346230] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16d32e0 00:22:52.692 [2024-07-15 07:58:37.346344] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16d5b30 00:22:52.692 [2024-07-15 07:58:37.346349] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16d5b30 00:22:52.692 [2024-07-15 07:58:37.346417] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.692 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.952 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.952 "name": "raid_bdev1", 00:22:52.952 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:22:52.952 "strip_size_kb": 0, 00:22:52.952 "state": "online", 00:22:52.952 "raid_level": "raid1", 00:22:52.952 "superblock": true, 00:22:52.952 "num_base_bdevs": 4, 00:22:52.952 "num_base_bdevs_discovered": 4, 00:22:52.952 "num_base_bdevs_operational": 4, 00:22:52.952 "base_bdevs_list": [ 00:22:52.952 { 00:22:52.952 "name": "BaseBdev1", 00:22:52.952 "uuid": "0ea905c5-72f4-5f66-acd7-52482de81254", 00:22:52.952 "is_configured": true, 00:22:52.952 "data_offset": 2048, 00:22:52.952 "data_size": 63488 00:22:52.952 }, 00:22:52.952 { 00:22:52.952 "name": "BaseBdev2", 00:22:52.952 "uuid": "ccf4c321-dedd-56c2-b938-5fee5432618c", 00:22:52.952 "is_configured": true, 00:22:52.952 "data_offset": 2048, 00:22:52.952 "data_size": 63488 00:22:52.952 }, 00:22:52.952 { 00:22:52.952 "name": "BaseBdev3", 00:22:52.952 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:22:52.952 "is_configured": true, 00:22:52.952 "data_offset": 2048, 00:22:52.952 "data_size": 63488 00:22:52.952 }, 00:22:52.952 { 00:22:52.952 "name": "BaseBdev4", 00:22:52.952 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:22:52.952 "is_configured": true, 00:22:52.952 "data_offset": 2048, 00:22:52.952 "data_size": 63488 00:22:52.952 } 00:22:52.952 ] 00:22:52.952 }' 00:22:52.952 07:58:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.952 07:58:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:53.522 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:53.522 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:53.782 [2024-07-15 07:58:38.307532] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- 
# local write_unit_size 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:53.782 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:54.042 [2024-07-15 07:58:38.688281] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16daa60 00:22:54.042 /dev/nbd0 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:54.042 1+0 records in 00:22:54.042 1+0 records out 00:22:54.042 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285521 s, 14.3 MB/s 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 
00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:54.042 07:58:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:04.033 63488+0 records in 00:23:04.033 63488+0 records out 00:23:04.033 32505856 bytes (33 MB, 31 MiB) copied, 9.18298 s, 3.5 MB/s 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:04.033 07:58:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:04.033 [2024-07-15 07:58:48.093370] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:04.033 [2024-07-15 07:58:48.272125] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.033 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.033 "name": "raid_bdev1", 00:23:04.033 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:04.033 "strip_size_kb": 0, 00:23:04.033 "state": "online", 00:23:04.033 "raid_level": "raid1", 00:23:04.033 "superblock": true, 00:23:04.033 "num_base_bdevs": 4, 00:23:04.033 "num_base_bdevs_discovered": 3, 00:23:04.033 "num_base_bdevs_operational": 3, 00:23:04.033 "base_bdevs_list": [ 00:23:04.033 { 00:23:04.033 "name": null, 00:23:04.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.033 "is_configured": false, 00:23:04.034 "data_offset": 2048, 00:23:04.034 "data_size": 63488 00:23:04.034 }, 00:23:04.034 { 00:23:04.034 "name": "BaseBdev2", 00:23:04.034 "uuid": "ccf4c321-dedd-56c2-b938-5fee5432618c", 00:23:04.034 "is_configured": true, 00:23:04.034 "data_offset": 2048, 00:23:04.034 "data_size": 63488 00:23:04.034 }, 00:23:04.034 { 00:23:04.034 "name": "BaseBdev3", 00:23:04.034 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:04.034 "is_configured": true, 00:23:04.034 "data_offset": 2048, 00:23:04.034 "data_size": 63488 00:23:04.034 }, 00:23:04.034 { 00:23:04.034 "name": "BaseBdev4", 00:23:04.034 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:04.034 "is_configured": true, 00:23:04.034 "data_offset": 2048, 00:23:04.034 "data_size": 63488 00:23:04.034 } 00:23:04.034 ] 00:23:04.034 }' 00:23:04.034 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.034 07:58:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:04.293 07:58:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:04.553 [2024-07-15 07:58:49.174402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:04.553 [2024-07-15 07:58:49.177115] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16dae80 00:23:04.553 [2024-07-15 07:58:49.178717] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:04.553 07:58:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:05.532 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:05.532 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:05.532 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:05.532 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:05.532 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:05.532 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.532 07:58:50 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.791 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.791 "name": "raid_bdev1", 00:23:05.792 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:05.792 "strip_size_kb": 0, 00:23:05.792 "state": "online", 00:23:05.792 "raid_level": "raid1", 00:23:05.792 "superblock": true, 00:23:05.792 "num_base_bdevs": 4, 00:23:05.792 "num_base_bdevs_discovered": 4, 00:23:05.792 "num_base_bdevs_operational": 4, 00:23:05.792 "process": { 00:23:05.792 "type": "rebuild", 00:23:05.792 "target": "spare", 00:23:05.792 "progress": { 00:23:05.792 "blocks": 22528, 00:23:05.792 "percent": 35 00:23:05.792 } 00:23:05.792 }, 00:23:05.792 "base_bdevs_list": [ 00:23:05.792 { 00:23:05.792 "name": "spare", 00:23:05.792 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:05.792 "is_configured": true, 00:23:05.792 "data_offset": 2048, 00:23:05.792 "data_size": 63488 00:23:05.792 }, 00:23:05.792 { 00:23:05.792 "name": "BaseBdev2", 00:23:05.792 "uuid": "ccf4c321-dedd-56c2-b938-5fee5432618c", 00:23:05.792 "is_configured": true, 00:23:05.792 "data_offset": 2048, 00:23:05.792 "data_size": 63488 00:23:05.792 }, 00:23:05.792 { 00:23:05.792 "name": "BaseBdev3", 00:23:05.792 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:05.792 "is_configured": true, 00:23:05.792 "data_offset": 2048, 00:23:05.792 "data_size": 63488 00:23:05.792 }, 00:23:05.792 { 00:23:05.792 "name": "BaseBdev4", 00:23:05.792 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:05.792 "is_configured": true, 00:23:05.792 "data_offset": 2048, 00:23:05.792 "data_size": 63488 00:23:05.792 } 00:23:05.792 ] 00:23:05.792 }' 00:23:05.792 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.792 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:05.792 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:05.792 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:05.792 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:06.051 [2024-07-15 07:58:50.659442] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:06.051 [2024-07-15 07:58:50.687479] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:06.051 [2024-07-15 07:58:50.687511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.051 [2024-07-15 07:58:50.687522] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:06.051 [2024-07-15 07:58:50.687526] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.051 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.310 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.310 "name": "raid_bdev1", 00:23:06.310 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:06.310 "strip_size_kb": 0, 00:23:06.310 "state": "online", 00:23:06.310 "raid_level": "raid1", 00:23:06.310 "superblock": true, 00:23:06.310 "num_base_bdevs": 4, 00:23:06.310 "num_base_bdevs_discovered": 3, 00:23:06.310 "num_base_bdevs_operational": 3, 00:23:06.310 "base_bdevs_list": [ 00:23:06.310 { 00:23:06.310 "name": null, 00:23:06.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.310 "is_configured": false, 00:23:06.310 "data_offset": 2048, 00:23:06.310 "data_size": 63488 00:23:06.310 }, 00:23:06.310 { 00:23:06.310 "name": "BaseBdev2", 00:23:06.310 "uuid": "ccf4c321-dedd-56c2-b938-5fee5432618c", 00:23:06.310 "is_configured": true, 00:23:06.310 "data_offset": 2048, 00:23:06.310 "data_size": 63488 00:23:06.310 }, 00:23:06.310 { 00:23:06.310 "name": "BaseBdev3", 00:23:06.310 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:06.310 "is_configured": true, 00:23:06.310 "data_offset": 2048, 00:23:06.310 "data_size": 63488 00:23:06.310 }, 00:23:06.310 { 00:23:06.310 "name": "BaseBdev4", 00:23:06.310 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:06.310 "is_configured": true, 00:23:06.310 "data_offset": 2048, 00:23:06.310 "data_size": 63488 00:23:06.310 } 00:23:06.310 ] 00:23:06.310 }' 00:23:06.310 07:58:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.310 07:58:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.879 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:07.139 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:07.139 "name": "raid_bdev1", 00:23:07.139 
"uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:07.139 "strip_size_kb": 0, 00:23:07.139 "state": "online", 00:23:07.139 "raid_level": "raid1", 00:23:07.139 "superblock": true, 00:23:07.139 "num_base_bdevs": 4, 00:23:07.139 "num_base_bdevs_discovered": 3, 00:23:07.139 "num_base_bdevs_operational": 3, 00:23:07.139 "base_bdevs_list": [ 00:23:07.139 { 00:23:07.139 "name": null, 00:23:07.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.139 "is_configured": false, 00:23:07.139 "data_offset": 2048, 00:23:07.139 "data_size": 63488 00:23:07.139 }, 00:23:07.139 { 00:23:07.139 "name": "BaseBdev2", 00:23:07.139 "uuid": "ccf4c321-dedd-56c2-b938-5fee5432618c", 00:23:07.139 "is_configured": true, 00:23:07.139 "data_offset": 2048, 00:23:07.139 "data_size": 63488 00:23:07.139 }, 00:23:07.139 { 00:23:07.139 "name": "BaseBdev3", 00:23:07.139 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:07.139 "is_configured": true, 00:23:07.139 "data_offset": 2048, 00:23:07.139 "data_size": 63488 00:23:07.139 }, 00:23:07.139 { 00:23:07.139 "name": "BaseBdev4", 00:23:07.139 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:07.139 "is_configured": true, 00:23:07.139 "data_offset": 2048, 00:23:07.139 "data_size": 63488 00:23:07.139 } 00:23:07.139 ] 00:23:07.139 }' 00:23:07.139 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:07.139 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:07.139 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:07.139 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:07.139 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:07.399 [2024-07-15 07:58:51.950734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:07.399 [2024-07-15 07:58:51.953544] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1875890 00:23:07.399 [2024-07-15 07:58:51.954715] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:07.399 07:58:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:08.338 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:08.339 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:08.339 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:08.339 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:08.339 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:08.339 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.339 07:58:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.598 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:08.598 "name": "raid_bdev1", 00:23:08.598 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:08.598 "strip_size_kb": 0, 00:23:08.598 "state": 
"online", 00:23:08.598 "raid_level": "raid1", 00:23:08.598 "superblock": true, 00:23:08.598 "num_base_bdevs": 4, 00:23:08.598 "num_base_bdevs_discovered": 4, 00:23:08.598 "num_base_bdevs_operational": 4, 00:23:08.598 "process": { 00:23:08.598 "type": "rebuild", 00:23:08.598 "target": "spare", 00:23:08.598 "progress": { 00:23:08.598 "blocks": 22528, 00:23:08.598 "percent": 35 00:23:08.598 } 00:23:08.598 }, 00:23:08.598 "base_bdevs_list": [ 00:23:08.598 { 00:23:08.598 "name": "spare", 00:23:08.598 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:08.598 "is_configured": true, 00:23:08.598 "data_offset": 2048, 00:23:08.598 "data_size": 63488 00:23:08.598 }, 00:23:08.598 { 00:23:08.598 "name": "BaseBdev2", 00:23:08.598 "uuid": "ccf4c321-dedd-56c2-b938-5fee5432618c", 00:23:08.598 "is_configured": true, 00:23:08.598 "data_offset": 2048, 00:23:08.598 "data_size": 63488 00:23:08.598 }, 00:23:08.598 { 00:23:08.599 "name": "BaseBdev3", 00:23:08.599 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:08.599 "is_configured": true, 00:23:08.599 "data_offset": 2048, 00:23:08.599 "data_size": 63488 00:23:08.599 }, 00:23:08.599 { 00:23:08.599 "name": "BaseBdev4", 00:23:08.599 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:08.599 "is_configured": true, 00:23:08.599 "data_offset": 2048, 00:23:08.599 "data_size": 63488 00:23:08.599 } 00:23:08.599 ] 00:23:08.599 }' 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:08.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:08.599 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:08.858 [2024-07-15 07:58:53.403387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:08.858 [2024-07-15 07:58:53.563795] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1875890 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:08.858 07:58:53 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.858 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.116 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:09.116 "name": "raid_bdev1", 00:23:09.116 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:09.116 "strip_size_kb": 0, 00:23:09.116 "state": "online", 00:23:09.116 "raid_level": "raid1", 00:23:09.116 "superblock": true, 00:23:09.116 "num_base_bdevs": 4, 00:23:09.116 "num_base_bdevs_discovered": 3, 00:23:09.116 "num_base_bdevs_operational": 3, 00:23:09.116 "process": { 00:23:09.116 "type": "rebuild", 00:23:09.116 "target": "spare", 00:23:09.116 "progress": { 00:23:09.116 "blocks": 32768, 00:23:09.116 "percent": 51 00:23:09.116 } 00:23:09.116 }, 00:23:09.116 "base_bdevs_list": [ 00:23:09.116 { 00:23:09.116 "name": "spare", 00:23:09.116 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:09.116 "is_configured": true, 00:23:09.116 "data_offset": 2048, 00:23:09.116 "data_size": 63488 00:23:09.116 }, 00:23:09.116 { 00:23:09.116 "name": null, 00:23:09.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.116 "is_configured": false, 00:23:09.116 "data_offset": 2048, 00:23:09.116 "data_size": 63488 00:23:09.116 }, 00:23:09.116 { 00:23:09.116 "name": "BaseBdev3", 00:23:09.116 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:09.116 "is_configured": true, 00:23:09.116 "data_offset": 2048, 00:23:09.116 "data_size": 63488 00:23:09.116 }, 00:23:09.116 { 00:23:09.116 "name": "BaseBdev4", 00:23:09.116 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:09.116 "is_configured": true, 00:23:09.116 "data_offset": 2048, 00:23:09.116 "data_size": 63488 00:23:09.116 } 00:23:09.116 ] 00:23:09.116 }' 00:23:09.117 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:09.117 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:09.117 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:09.117 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=799 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:09.376 07:58:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.376 07:58:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:09.376 "name": "raid_bdev1", 00:23:09.376 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:09.376 "strip_size_kb": 0, 00:23:09.376 "state": "online", 00:23:09.376 "raid_level": "raid1", 00:23:09.376 "superblock": true, 00:23:09.376 "num_base_bdevs": 4, 00:23:09.376 "num_base_bdevs_discovered": 3, 00:23:09.376 "num_base_bdevs_operational": 3, 00:23:09.376 "process": { 00:23:09.376 "type": "rebuild", 00:23:09.376 "target": "spare", 00:23:09.376 "progress": { 00:23:09.376 "blocks": 38912, 00:23:09.376 "percent": 61 00:23:09.376 } 00:23:09.376 }, 00:23:09.376 "base_bdevs_list": [ 00:23:09.376 { 00:23:09.376 "name": "spare", 00:23:09.376 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:09.376 "is_configured": true, 00:23:09.376 "data_offset": 2048, 00:23:09.376 "data_size": 63488 00:23:09.376 }, 00:23:09.376 { 00:23:09.376 "name": null, 00:23:09.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.376 "is_configured": false, 00:23:09.376 "data_offset": 2048, 00:23:09.376 "data_size": 63488 00:23:09.376 }, 00:23:09.376 { 00:23:09.376 "name": "BaseBdev3", 00:23:09.376 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:09.376 "is_configured": true, 00:23:09.376 "data_offset": 2048, 00:23:09.376 "data_size": 63488 00:23:09.376 }, 00:23:09.376 { 00:23:09.376 "name": "BaseBdev4", 00:23:09.376 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:09.376 "is_configured": true, 00:23:09.376 "data_offset": 2048, 00:23:09.376 "data_size": 63488 00:23:09.376 } 00:23:09.376 ] 00:23:09.376 }' 00:23:09.376 07:58:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:09.376 07:58:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:09.376 07:58:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:09.636 07:58:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:09.636 07:58:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.574 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.574 [2024-07-15 07:58:55.173172] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:10.574 [2024-07-15 07:58:55.173216] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: 
Finished rebuild on raid bdev raid_bdev1 00:23:10.574 [2024-07-15 07:58:55.173292] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:10.833 "name": "raid_bdev1", 00:23:10.833 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:10.833 "strip_size_kb": 0, 00:23:10.833 "state": "online", 00:23:10.833 "raid_level": "raid1", 00:23:10.833 "superblock": true, 00:23:10.833 "num_base_bdevs": 4, 00:23:10.833 "num_base_bdevs_discovered": 3, 00:23:10.833 "num_base_bdevs_operational": 3, 00:23:10.833 "base_bdevs_list": [ 00:23:10.833 { 00:23:10.833 "name": "spare", 00:23:10.833 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:10.833 "is_configured": true, 00:23:10.833 "data_offset": 2048, 00:23:10.833 "data_size": 63488 00:23:10.833 }, 00:23:10.833 { 00:23:10.833 "name": null, 00:23:10.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.833 "is_configured": false, 00:23:10.833 "data_offset": 2048, 00:23:10.833 "data_size": 63488 00:23:10.833 }, 00:23:10.833 { 00:23:10.833 "name": "BaseBdev3", 00:23:10.833 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:10.833 "is_configured": true, 00:23:10.833 "data_offset": 2048, 00:23:10.833 "data_size": 63488 00:23:10.833 }, 00:23:10.833 { 00:23:10.833 "name": "BaseBdev4", 00:23:10.833 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:10.833 "is_configured": true, 00:23:10.833 "data_offset": 2048, 00:23:10.833 "data_size": 63488 00:23:10.833 } 00:23:10.833 ] 00:23:10.833 }' 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.833 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.092 "name": "raid_bdev1", 00:23:11.092 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:11.092 "strip_size_kb": 0, 00:23:11.092 "state": "online", 00:23:11.092 "raid_level": "raid1", 00:23:11.092 "superblock": true, 00:23:11.092 "num_base_bdevs": 4, 00:23:11.092 "num_base_bdevs_discovered": 3, 00:23:11.092 "num_base_bdevs_operational": 3, 00:23:11.092 "base_bdevs_list": [ 00:23:11.092 { 00:23:11.092 "name": "spare", 00:23:11.092 "uuid": 
"01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:11.092 "is_configured": true, 00:23:11.092 "data_offset": 2048, 00:23:11.092 "data_size": 63488 00:23:11.092 }, 00:23:11.092 { 00:23:11.092 "name": null, 00:23:11.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.092 "is_configured": false, 00:23:11.092 "data_offset": 2048, 00:23:11.092 "data_size": 63488 00:23:11.092 }, 00:23:11.092 { 00:23:11.092 "name": "BaseBdev3", 00:23:11.092 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:11.092 "is_configured": true, 00:23:11.092 "data_offset": 2048, 00:23:11.092 "data_size": 63488 00:23:11.092 }, 00:23:11.092 { 00:23:11.092 "name": "BaseBdev4", 00:23:11.092 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:11.092 "is_configured": true, 00:23:11.092 "data_offset": 2048, 00:23:11.092 "data_size": 63488 00:23:11.092 } 00:23:11.092 ] 00:23:11.092 }' 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.092 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.093 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.352 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.352 "name": "raid_bdev1", 00:23:11.352 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:11.352 "strip_size_kb": 0, 00:23:11.352 "state": "online", 00:23:11.352 "raid_level": "raid1", 00:23:11.352 "superblock": true, 00:23:11.352 "num_base_bdevs": 4, 00:23:11.352 "num_base_bdevs_discovered": 3, 00:23:11.352 "num_base_bdevs_operational": 3, 00:23:11.352 "base_bdevs_list": [ 00:23:11.352 { 00:23:11.352 "name": "spare", 00:23:11.352 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:11.352 "is_configured": true, 00:23:11.352 "data_offset": 2048, 00:23:11.352 "data_size": 63488 00:23:11.352 }, 00:23:11.352 { 00:23:11.352 "name": null, 00:23:11.352 "uuid": "00000000-0000-0000-0000-000000000000", 
00:23:11.352 "is_configured": false, 00:23:11.352 "data_offset": 2048, 00:23:11.352 "data_size": 63488 00:23:11.352 }, 00:23:11.352 { 00:23:11.352 "name": "BaseBdev3", 00:23:11.352 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:11.352 "is_configured": true, 00:23:11.352 "data_offset": 2048, 00:23:11.352 "data_size": 63488 00:23:11.352 }, 00:23:11.352 { 00:23:11.352 "name": "BaseBdev4", 00:23:11.352 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:11.352 "is_configured": true, 00:23:11.352 "data_offset": 2048, 00:23:11.352 "data_size": 63488 00:23:11.352 } 00:23:11.352 ] 00:23:11.352 }' 00:23:11.352 07:58:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.352 07:58:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:11.922 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:11.922 [2024-07-15 07:58:56.612365] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:11.922 [2024-07-15 07:58:56.612383] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:11.922 [2024-07-15 07:58:56.612423] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:11.922 [2024-07-15 07:58:56.612479] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:11.922 [2024-07-15 07:58:56.612485] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16d5b30 name raid_bdev1, state offline 00:23:11.922 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.922 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:12.182 07:58:56 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:12.441 /dev/nbd0 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:12.441 1+0 records in 00:23:12.441 1+0 records out 00:23:12.441 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310834 s, 13.2 MB/s 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:12.441 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:12.702 /dev/nbd1 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:12.702 1+0 records in 00:23:12.702 1+0 records out 00:23:12.702 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202323 s, 20.2 MB/s 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:12.702 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:12.962 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:13.222 07:58:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:13.483 [2024-07-15 07:58:58.122448] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:13.483 [2024-07-15 07:58:58.122482] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.483 [2024-07-15 07:58:58.122493] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d5830 00:23:13.483 [2024-07-15 07:58:58.122500] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.483 [2024-07-15 07:58:58.123856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.483 [2024-07-15 07:58:58.123876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:13.483 [2024-07-15 07:58:58.123932] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:13.483 [2024-07-15 07:58:58.123953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:13.483 [2024-07-15 07:58:58.124038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:13.483 [2024-07-15 07:58:58.124093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:13.483 spare 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.483 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.483 [2024-07-15 07:58:58.224383] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16d3070 00:23:13.483 [2024-07-15 07:58:58.224391] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:13.483 [2024-07-15 07:58:58.224534] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1882f60 00:23:13.483 [2024-07-15 07:58:58.224644] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16d3070 00:23:13.483 [2024-07-15 07:58:58.224650] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16d3070 00:23:13.483 [2024-07-15 07:58:58.224725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:13.743 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.743 "name": "raid_bdev1", 00:23:13.743 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:13.743 "strip_size_kb": 0, 00:23:13.743 "state": "online", 00:23:13.743 "raid_level": "raid1", 00:23:13.743 "superblock": true, 00:23:13.743 "num_base_bdevs": 4, 00:23:13.743 "num_base_bdevs_discovered": 3, 00:23:13.743 "num_base_bdevs_operational": 3, 00:23:13.743 "base_bdevs_list": [ 00:23:13.743 { 00:23:13.743 "name": "spare", 00:23:13.743 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:13.743 "is_configured": true, 00:23:13.743 "data_offset": 2048, 00:23:13.743 "data_size": 63488 00:23:13.743 }, 00:23:13.743 { 00:23:13.743 "name": null, 00:23:13.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.743 "is_configured": false, 00:23:13.743 "data_offset": 2048, 00:23:13.743 "data_size": 63488 00:23:13.743 }, 00:23:13.743 { 00:23:13.743 "name": "BaseBdev3", 00:23:13.743 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:13.743 "is_configured": true, 00:23:13.743 "data_offset": 2048, 00:23:13.743 "data_size": 63488 00:23:13.743 }, 00:23:13.743 { 00:23:13.743 "name": "BaseBdev4", 00:23:13.743 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:13.743 "is_configured": true, 00:23:13.743 "data_offset": 2048, 00:23:13.743 "data_size": 63488 00:23:13.743 } 00:23:13.743 ] 00:23:13.743 }' 00:23:13.743 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.743 07:58:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.332 07:58:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.332 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.332 "name": "raid_bdev1", 00:23:14.332 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:14.332 "strip_size_kb": 0, 00:23:14.332 "state": 
"online", 00:23:14.332 "raid_level": "raid1", 00:23:14.332 "superblock": true, 00:23:14.332 "num_base_bdevs": 4, 00:23:14.332 "num_base_bdevs_discovered": 3, 00:23:14.332 "num_base_bdevs_operational": 3, 00:23:14.332 "base_bdevs_list": [ 00:23:14.332 { 00:23:14.332 "name": "spare", 00:23:14.332 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:14.332 "is_configured": true, 00:23:14.332 "data_offset": 2048, 00:23:14.332 "data_size": 63488 00:23:14.332 }, 00:23:14.332 { 00:23:14.333 "name": null, 00:23:14.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.333 "is_configured": false, 00:23:14.333 "data_offset": 2048, 00:23:14.333 "data_size": 63488 00:23:14.333 }, 00:23:14.333 { 00:23:14.333 "name": "BaseBdev3", 00:23:14.333 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:14.333 "is_configured": true, 00:23:14.333 "data_offset": 2048, 00:23:14.333 "data_size": 63488 00:23:14.333 }, 00:23:14.333 { 00:23:14.333 "name": "BaseBdev4", 00:23:14.333 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:14.333 "is_configured": true, 00:23:14.333 "data_offset": 2048, 00:23:14.333 "data_size": 63488 00:23:14.333 } 00:23:14.333 ] 00:23:14.333 }' 00:23:14.333 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.593 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:14.851 [2024-07-15 07:58:59.522072] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.852 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.111 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.111 "name": "raid_bdev1", 00:23:15.111 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:15.111 "strip_size_kb": 0, 00:23:15.111 "state": "online", 00:23:15.111 "raid_level": "raid1", 00:23:15.111 "superblock": true, 00:23:15.111 "num_base_bdevs": 4, 00:23:15.111 "num_base_bdevs_discovered": 2, 00:23:15.111 "num_base_bdevs_operational": 2, 00:23:15.111 "base_bdevs_list": [ 00:23:15.111 { 00:23:15.111 "name": null, 00:23:15.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.111 "is_configured": false, 00:23:15.111 "data_offset": 2048, 00:23:15.111 "data_size": 63488 00:23:15.111 }, 00:23:15.111 { 00:23:15.111 "name": null, 00:23:15.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.111 "is_configured": false, 00:23:15.111 "data_offset": 2048, 00:23:15.111 "data_size": 63488 00:23:15.111 }, 00:23:15.111 { 00:23:15.111 "name": "BaseBdev3", 00:23:15.111 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:15.111 "is_configured": true, 00:23:15.111 "data_offset": 2048, 00:23:15.111 "data_size": 63488 00:23:15.111 }, 00:23:15.111 { 00:23:15.111 "name": "BaseBdev4", 00:23:15.111 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:15.111 "is_configured": true, 00:23:15.111 "data_offset": 2048, 00:23:15.111 "data_size": 63488 00:23:15.111 } 00:23:15.111 ] 00:23:15.111 }' 00:23:15.111 07:58:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.111 07:58:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:15.680 07:59:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:15.680 [2024-07-15 07:59:00.396309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:15.680 [2024-07-15 07:59:00.396426] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:15.680 [2024-07-15 07:59:00.396434] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:15.680 [2024-07-15 07:59:00.396456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:15.680 [2024-07-15 07:59:00.399048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1882a20 00:23:15.680 [2024-07-15 07:59:00.400630] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:15.680 07:59:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:17.060 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.060 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.060 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.060 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.061 "name": "raid_bdev1", 00:23:17.061 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:17.061 "strip_size_kb": 0, 00:23:17.061 "state": "online", 00:23:17.061 "raid_level": "raid1", 00:23:17.061 "superblock": true, 00:23:17.061 "num_base_bdevs": 4, 00:23:17.061 "num_base_bdevs_discovered": 3, 00:23:17.061 "num_base_bdevs_operational": 3, 00:23:17.061 "process": { 00:23:17.061 "type": "rebuild", 00:23:17.061 "target": "spare", 00:23:17.061 "progress": { 00:23:17.061 "blocks": 22528, 00:23:17.061 "percent": 35 00:23:17.061 } 00:23:17.061 }, 00:23:17.061 "base_bdevs_list": [ 00:23:17.061 { 00:23:17.061 "name": "spare", 00:23:17.061 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:17.061 "is_configured": true, 00:23:17.061 "data_offset": 2048, 00:23:17.061 "data_size": 63488 00:23:17.061 }, 00:23:17.061 { 00:23:17.061 "name": null, 00:23:17.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.061 "is_configured": false, 00:23:17.061 "data_offset": 2048, 00:23:17.061 "data_size": 63488 00:23:17.061 }, 00:23:17.061 { 00:23:17.061 "name": "BaseBdev3", 00:23:17.061 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:17.061 "is_configured": true, 00:23:17.061 "data_offset": 2048, 00:23:17.061 "data_size": 63488 00:23:17.061 }, 00:23:17.061 { 00:23:17.061 "name": "BaseBdev4", 00:23:17.061 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:17.061 "is_configured": true, 00:23:17.061 "data_offset": 2048, 00:23:17.061 "data_size": 63488 00:23:17.061 } 00:23:17.061 ] 00:23:17.061 }' 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.061 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:17.061 [2024-07-15 07:59:01.804889] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:17.061 [2024-07-15 07:59:01.808854] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:17.061 [2024-07-15 07:59:01.808882] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.061 [2024-07-15 07:59:01.808892] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:17.061 [2024-07-15 07:59:01.808896] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.321 "name": "raid_bdev1", 00:23:17.321 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:17.321 "strip_size_kb": 0, 00:23:17.321 "state": "online", 00:23:17.321 "raid_level": "raid1", 00:23:17.321 "superblock": true, 00:23:17.321 "num_base_bdevs": 4, 00:23:17.321 "num_base_bdevs_discovered": 2, 00:23:17.321 "num_base_bdevs_operational": 2, 00:23:17.321 "base_bdevs_list": [ 00:23:17.321 { 00:23:17.321 "name": null, 00:23:17.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.321 "is_configured": false, 00:23:17.321 "data_offset": 2048, 00:23:17.321 "data_size": 63488 00:23:17.321 }, 00:23:17.321 { 00:23:17.321 "name": null, 00:23:17.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.321 "is_configured": false, 00:23:17.321 "data_offset": 2048, 00:23:17.321 "data_size": 63488 00:23:17.321 }, 00:23:17.321 { 00:23:17.321 "name": "BaseBdev3", 00:23:17.321 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:17.321 "is_configured": true, 00:23:17.321 "data_offset": 2048, 00:23:17.321 "data_size": 63488 00:23:17.321 }, 00:23:17.321 { 00:23:17.321 "name": "BaseBdev4", 00:23:17.321 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:17.321 "is_configured": true, 00:23:17.321 "data_offset": 2048, 00:23:17.321 "data_size": 63488 
00:23:17.321 } 00:23:17.321 ] 00:23:17.321 }' 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.321 07:59:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:17.892 07:59:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:18.152 [2024-07-15 07:59:02.687088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:18.152 [2024-07-15 07:59:02.687120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.152 [2024-07-15 07:59:02.687133] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1882590 00:23:18.152 [2024-07-15 07:59:02.687140] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.152 [2024-07-15 07:59:02.687444] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.152 [2024-07-15 07:59:02.687455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:18.152 [2024-07-15 07:59:02.687515] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:18.152 [2024-07-15 07:59:02.687522] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:23:18.152 [2024-07-15 07:59:02.687528] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:18.152 [2024-07-15 07:59:02.687538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:18.152 [2024-07-15 07:59:02.690072] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1874530 00:23:18.152 [2024-07-15 07:59:02.691253] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:18.152 spare 00:23:18.152 07:59:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.091 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.351 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.351 "name": "raid_bdev1", 00:23:19.351 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:19.351 "strip_size_kb": 0, 00:23:19.351 "state": "online", 00:23:19.351 "raid_level": "raid1", 00:23:19.351 "superblock": true, 00:23:19.351 "num_base_bdevs": 4, 00:23:19.351 "num_base_bdevs_discovered": 3, 00:23:19.351 "num_base_bdevs_operational": 3, 00:23:19.351 "process": { 00:23:19.351 "type": "rebuild", 00:23:19.351 "target": 
"spare", 00:23:19.351 "progress": { 00:23:19.351 "blocks": 22528, 00:23:19.351 "percent": 35 00:23:19.351 } 00:23:19.351 }, 00:23:19.351 "base_bdevs_list": [ 00:23:19.351 { 00:23:19.351 "name": "spare", 00:23:19.351 "uuid": "01ff5f6a-ec13-5eae-88cf-268d7cfac17b", 00:23:19.351 "is_configured": true, 00:23:19.351 "data_offset": 2048, 00:23:19.351 "data_size": 63488 00:23:19.351 }, 00:23:19.351 { 00:23:19.351 "name": null, 00:23:19.351 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.351 "is_configured": false, 00:23:19.351 "data_offset": 2048, 00:23:19.351 "data_size": 63488 00:23:19.351 }, 00:23:19.351 { 00:23:19.351 "name": "BaseBdev3", 00:23:19.351 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:19.351 "is_configured": true, 00:23:19.351 "data_offset": 2048, 00:23:19.351 "data_size": 63488 00:23:19.351 }, 00:23:19.351 { 00:23:19.351 "name": "BaseBdev4", 00:23:19.351 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:19.351 "is_configured": true, 00:23:19.351 "data_offset": 2048, 00:23:19.351 "data_size": 63488 00:23:19.351 } 00:23:19.351 ] 00:23:19.351 }' 00:23:19.351 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:19.351 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:19.351 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:19.351 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:19.351 07:59:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:19.611 [2024-07-15 07:59:04.139558] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:19.611 [2024-07-15 07:59:04.199963] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:19.611 [2024-07-15 07:59:04.199992] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:19.611 [2024-07-15 07:59:04.200002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:19.611 [2024-07-15 07:59:04.200006] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.611 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.870 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.870 "name": "raid_bdev1", 00:23:19.870 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:19.870 "strip_size_kb": 0, 00:23:19.870 "state": "online", 00:23:19.870 "raid_level": "raid1", 00:23:19.870 "superblock": true, 00:23:19.870 "num_base_bdevs": 4, 00:23:19.870 "num_base_bdevs_discovered": 2, 00:23:19.870 "num_base_bdevs_operational": 2, 00:23:19.870 "base_bdevs_list": [ 00:23:19.870 { 00:23:19.870 "name": null, 00:23:19.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.870 "is_configured": false, 00:23:19.870 "data_offset": 2048, 00:23:19.870 "data_size": 63488 00:23:19.870 }, 00:23:19.870 { 00:23:19.870 "name": null, 00:23:19.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.870 "is_configured": false, 00:23:19.870 "data_offset": 2048, 00:23:19.870 "data_size": 63488 00:23:19.870 }, 00:23:19.870 { 00:23:19.870 "name": "BaseBdev3", 00:23:19.870 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:19.870 "is_configured": true, 00:23:19.870 "data_offset": 2048, 00:23:19.870 "data_size": 63488 00:23:19.870 }, 00:23:19.870 { 00:23:19.870 "name": "BaseBdev4", 00:23:19.870 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:19.870 "is_configured": true, 00:23:19.870 "data_offset": 2048, 00:23:19.870 "data_size": 63488 00:23:19.870 } 00:23:19.870 ] 00:23:19.870 }' 00:23:19.870 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.870 07:59:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.439 07:59:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.439 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.439 "name": "raid_bdev1", 00:23:20.439 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:20.439 "strip_size_kb": 0, 00:23:20.439 "state": "online", 00:23:20.439 "raid_level": "raid1", 00:23:20.439 "superblock": true, 00:23:20.439 "num_base_bdevs": 4, 00:23:20.439 "num_base_bdevs_discovered": 2, 00:23:20.439 "num_base_bdevs_operational": 2, 00:23:20.439 "base_bdevs_list": [ 00:23:20.439 { 00:23:20.439 "name": null, 00:23:20.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.439 "is_configured": false, 00:23:20.439 "data_offset": 2048, 00:23:20.439 "data_size": 63488 00:23:20.439 }, 00:23:20.439 { 00:23:20.439 "name": null, 00:23:20.439 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:20.439 "is_configured": false, 00:23:20.439 "data_offset": 2048, 00:23:20.439 "data_size": 63488 00:23:20.439 }, 00:23:20.439 { 00:23:20.439 "name": "BaseBdev3", 00:23:20.439 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:20.439 "is_configured": true, 00:23:20.439 "data_offset": 2048, 00:23:20.439 "data_size": 63488 00:23:20.439 }, 00:23:20.439 { 00:23:20.439 "name": "BaseBdev4", 00:23:20.439 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:20.439 "is_configured": true, 00:23:20.439 "data_offset": 2048, 00:23:20.439 "data_size": 63488 00:23:20.439 } 00:23:20.439 ] 00:23:20.439 }' 00:23:20.440 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.440 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:20.440 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.699 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:20.699 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:20.699 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:20.959 [2024-07-15 07:59:05.603265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:20.959 [2024-07-15 07:59:05.603297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:20.959 [2024-07-15 07:59:05.603309] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16dbf60 00:23:20.959 [2024-07-15 07:59:05.603315] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:20.959 [2024-07-15 07:59:05.603596] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:20.959 [2024-07-15 07:59:05.603607] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:20.959 [2024-07-15 07:59:05.603652] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:20.959 [2024-07-15 07:59:05.603659] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:20.959 [2024-07-15 07:59:05.603665] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:20.959 BaseBdev1 00:23:20.959 07:59:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:21.900 07:59:06 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.900 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.158 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.159 "name": "raid_bdev1", 00:23:22.159 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:22.159 "strip_size_kb": 0, 00:23:22.159 "state": "online", 00:23:22.159 "raid_level": "raid1", 00:23:22.159 "superblock": true, 00:23:22.159 "num_base_bdevs": 4, 00:23:22.159 "num_base_bdevs_discovered": 2, 00:23:22.159 "num_base_bdevs_operational": 2, 00:23:22.159 "base_bdevs_list": [ 00:23:22.159 { 00:23:22.159 "name": null, 00:23:22.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.159 "is_configured": false, 00:23:22.159 "data_offset": 2048, 00:23:22.159 "data_size": 63488 00:23:22.159 }, 00:23:22.159 { 00:23:22.159 "name": null, 00:23:22.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.159 "is_configured": false, 00:23:22.159 "data_offset": 2048, 00:23:22.159 "data_size": 63488 00:23:22.159 }, 00:23:22.159 { 00:23:22.159 "name": "BaseBdev3", 00:23:22.159 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:22.159 "is_configured": true, 00:23:22.159 "data_offset": 2048, 00:23:22.159 "data_size": 63488 00:23:22.159 }, 00:23:22.159 { 00:23:22.159 "name": "BaseBdev4", 00:23:22.159 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:22.159 "is_configured": true, 00:23:22.159 "data_offset": 2048, 00:23:22.159 "data_size": 63488 00:23:22.159 } 00:23:22.159 ] 00:23:22.159 }' 00:23:22.159 07:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.159 07:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.725 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:22.985 "name": "raid_bdev1", 00:23:22.985 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:22.985 "strip_size_kb": 0, 00:23:22.985 "state": "online", 00:23:22.985 "raid_level": "raid1", 00:23:22.985 "superblock": true, 
00:23:22.985 "num_base_bdevs": 4, 00:23:22.985 "num_base_bdevs_discovered": 2, 00:23:22.985 "num_base_bdevs_operational": 2, 00:23:22.985 "base_bdevs_list": [ 00:23:22.985 { 00:23:22.985 "name": null, 00:23:22.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.985 "is_configured": false, 00:23:22.985 "data_offset": 2048, 00:23:22.985 "data_size": 63488 00:23:22.985 }, 00:23:22.985 { 00:23:22.985 "name": null, 00:23:22.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.985 "is_configured": false, 00:23:22.985 "data_offset": 2048, 00:23:22.985 "data_size": 63488 00:23:22.985 }, 00:23:22.985 { 00:23:22.985 "name": "BaseBdev3", 00:23:22.985 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:22.985 "is_configured": true, 00:23:22.985 "data_offset": 2048, 00:23:22.985 "data_size": 63488 00:23:22.985 }, 00:23:22.985 { 00:23:22.985 "name": "BaseBdev4", 00:23:22.985 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:22.985 "is_configured": true, 00:23:22.985 "data_offset": 2048, 00:23:22.985 "data_size": 63488 00:23:22.985 } 00:23:22.985 ] 00:23:22.985 }' 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:22.985 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:23.245 [2024-07-15 07:59:07.788807] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:23.245 [2024-07-15 07:59:07.788901] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:23:23.245 [2024-07-15 07:59:07.788910] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:23.245 request: 00:23:23.245 { 00:23:23.245 "base_bdev": "BaseBdev1", 00:23:23.245 "raid_bdev": "raid_bdev1", 00:23:23.245 "method": "bdev_raid_add_base_bdev", 00:23:23.245 "req_id": 1 00:23:23.245 } 00:23:23.245 Got JSON-RPC error response 00:23:23.245 response: 00:23:23.245 { 00:23:23.245 "code": -22, 00:23:23.245 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:23.245 } 00:23:23.245 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:23.245 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:23.245 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:23.245 07:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:23.245 07:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.182 07:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.473 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.473 "name": "raid_bdev1", 00:23:24.473 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:24.473 "strip_size_kb": 0, 00:23:24.473 "state": "online", 00:23:24.473 "raid_level": "raid1", 00:23:24.473 "superblock": true, 00:23:24.473 "num_base_bdevs": 4, 00:23:24.473 "num_base_bdevs_discovered": 2, 00:23:24.473 "num_base_bdevs_operational": 2, 00:23:24.473 "base_bdevs_list": [ 00:23:24.473 { 00:23:24.473 "name": null, 00:23:24.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:24.473 "is_configured": false, 00:23:24.473 "data_offset": 2048, 00:23:24.473 "data_size": 63488 00:23:24.473 }, 00:23:24.473 { 00:23:24.473 "name": null, 00:23:24.473 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:24.473 "is_configured": false, 00:23:24.473 "data_offset": 2048, 00:23:24.473 "data_size": 63488 00:23:24.473 }, 00:23:24.473 { 00:23:24.473 "name": "BaseBdev3", 00:23:24.473 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:24.473 "is_configured": true, 00:23:24.473 "data_offset": 2048, 00:23:24.473 "data_size": 63488 00:23:24.473 }, 00:23:24.473 { 00:23:24.473 "name": "BaseBdev4", 00:23:24.473 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:24.473 "is_configured": true, 00:23:24.473 "data_offset": 2048, 00:23:24.473 "data_size": 63488 00:23:24.473 } 00:23:24.473 ] 00:23:24.473 }' 00:23:24.473 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.473 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:25.043 "name": "raid_bdev1", 00:23:25.043 "uuid": "5b4deca9-df09-4b53-9174-9241095f81c6", 00:23:25.043 "strip_size_kb": 0, 00:23:25.043 "state": "online", 00:23:25.043 "raid_level": "raid1", 00:23:25.043 "superblock": true, 00:23:25.043 "num_base_bdevs": 4, 00:23:25.043 "num_base_bdevs_discovered": 2, 00:23:25.043 "num_base_bdevs_operational": 2, 00:23:25.043 "base_bdevs_list": [ 00:23:25.043 { 00:23:25.043 "name": null, 00:23:25.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.043 "is_configured": false, 00:23:25.043 "data_offset": 2048, 00:23:25.043 "data_size": 63488 00:23:25.043 }, 00:23:25.043 { 00:23:25.043 "name": null, 00:23:25.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.043 "is_configured": false, 00:23:25.043 "data_offset": 2048, 00:23:25.043 "data_size": 63488 00:23:25.043 }, 00:23:25.043 { 00:23:25.043 "name": "BaseBdev3", 00:23:25.043 "uuid": "a545452e-a3f7-596f-8745-9de3904b13b2", 00:23:25.043 "is_configured": true, 00:23:25.043 "data_offset": 2048, 00:23:25.043 "data_size": 63488 00:23:25.043 }, 00:23:25.043 { 00:23:25.043 "name": "BaseBdev4", 00:23:25.043 "uuid": "f8ab1ac2-5cdb-53cb-9825-8994a3787037", 00:23:25.043 "is_configured": true, 00:23:25.043 "data_offset": 2048, 00:23:25.043 "data_size": 63488 00:23:25.043 } 00:23:25.043 ] 00:23:25.043 }' 00:23:25.043 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:25.309 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1726472 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1726472 ']' 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1726472 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1726472 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1726472' 00:23:25.310 killing process with pid 1726472 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1726472 00:23:25.310 Received shutdown signal, test time was about 60.000000 seconds 00:23:25.310 00:23:25.310 Latency(us) 00:23:25.310 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:25.310 =================================================================================================================== 00:23:25.310 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:25.310 [2024-07-15 07:59:09.893666] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:25.310 [2024-07-15 07:59:09.893745] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:25.310 [2024-07-15 07:59:09.893785] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:25.310 [2024-07-15 07:59:09.893792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16d3070 name raid_bdev1, state offline 00:23:25.310 07:59:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1726472 00:23:25.310 [2024-07-15 07:59:09.920217] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:25.310 07:59:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:25.310 00:23:25.310 real 0m35.944s 00:23:25.310 user 0m50.567s 00:23:25.310 sys 0m5.462s 00:23:25.310 07:59:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:25.310 07:59:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:25.310 ************************************ 00:23:25.310 END TEST raid_rebuild_test_sb 00:23:25.310 ************************************ 00:23:25.571 07:59:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:25.571 07:59:10 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:23:25.571 07:59:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:25.571 07:59:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:25.571 07:59:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:25.571 ************************************ 00:23:25.571 START TEST raid_rebuild_test_io 00:23:25.571 ************************************ 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1732906 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1732906 /var/tmp/spdk-raid.sock 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1732906 ']' 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:25.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:25.571 07:59:10 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:25.571 [2024-07-15 07:59:10.184878] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:23:25.572 [2024-07-15 07:59:10.184932] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1732906 ] 00:23:25.572 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:25.572 Zero copy mechanism will not be used. 00:23:25.572 [2024-07-15 07:59:10.276129] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:25.832 [2024-07-15 07:59:10.352836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:25.832 [2024-07-15 07:59:10.399030] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:25.832 [2024-07-15 07:59:10.399056] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:26.401 07:59:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:26.401 07:59:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:26.401 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:26.401 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:26.661 BaseBdev1_malloc 00:23:26.661 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:26.661 [2024-07-15 07:59:11.394648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:26.661 [2024-07-15 07:59:11.394684] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.661 [2024-07-15 07:59:11.394697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1458d30 00:23:26.661 [2024-07-15 07:59:11.394704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.661 [2024-07-15 07:59:11.395994] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.661 [2024-07-15 07:59:11.396013] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:26.661 BaseBdev1 00:23:26.661 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in 
"${base_bdevs[@]}" 00:23:26.661 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:26.921 BaseBdev2_malloc 00:23:26.921 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:27.181 [2024-07-15 07:59:11.769287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:27.181 [2024-07-15 07:59:11.769312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.181 [2024-07-15 07:59:11.769323] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x160bc60 00:23:27.181 [2024-07-15 07:59:11.769330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.181 [2024-07-15 07:59:11.770509] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.181 [2024-07-15 07:59:11.770528] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:27.181 BaseBdev2 00:23:27.181 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:27.181 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:27.443 BaseBdev3_malloc 00:23:27.443 07:59:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:27.443 [2024-07-15 07:59:12.148038] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:27.443 [2024-07-15 07:59:12.148065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.443 [2024-07-15 07:59:12.148076] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f0b90 00:23:27.443 [2024-07-15 07:59:12.148082] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.443 [2024-07-15 07:59:12.149245] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.443 [2024-07-15 07:59:12.149263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:27.443 BaseBdev3 00:23:27.443 07:59:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:27.443 07:59:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:27.705 BaseBdev4_malloc 00:23:27.705 07:59:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:27.966 [2024-07-15 07:59:12.534671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:27.966 [2024-07-15 07:59:12.534696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.966 [2024-07-15 07:59:12.534706] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x14598c0 00:23:27.966 [2024-07-15 07:59:12.534715] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.966 [2024-07-15 07:59:12.535865] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.966 [2024-07-15 07:59:12.535882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:27.966 BaseBdev4 00:23:27.966 07:59:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:27.966 spare_malloc 00:23:28.226 07:59:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:28.226 spare_delay 00:23:28.226 07:59:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:28.487 [2024-07-15 07:59:13.081605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:28.487 [2024-07-15 07:59:13.081630] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:28.487 [2024-07-15 07:59:13.081640] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1451a80 00:23:28.487 [2024-07-15 07:59:13.081651] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:28.487 [2024-07-15 07:59:13.082799] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:28.487 [2024-07-15 07:59:13.082817] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:28.487 spare 00:23:28.487 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:28.746 [2024-07-15 07:59:13.262082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:28.746 [2024-07-15 07:59:13.263047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:28.746 [2024-07-15 07:59:13.263087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:28.746 [2024-07-15 07:59:13.263119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:28.746 [2024-07-15 07:59:13.263175] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1452b30 00:23:28.746 [2024-07-15 07:59:13.263181] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:28.746 [2024-07-15 07:59:13.263330] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1456bf0 00:23:28.746 [2024-07-15 07:59:13.263442] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1452b30 00:23:28.746 [2024-07-15 07:59:13.263448] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1452b30 00:23:28.746 [2024-07-15 07:59:13.263527] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:28.746 07:59:13 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.746 "name": "raid_bdev1", 00:23:28.746 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:28.746 "strip_size_kb": 0, 00:23:28.746 "state": "online", 00:23:28.746 "raid_level": "raid1", 00:23:28.746 "superblock": false, 00:23:28.746 "num_base_bdevs": 4, 00:23:28.746 "num_base_bdevs_discovered": 4, 00:23:28.746 "num_base_bdevs_operational": 4, 00:23:28.746 "base_bdevs_list": [ 00:23:28.746 { 00:23:28.746 "name": "BaseBdev1", 00:23:28.746 "uuid": "576a0f10-6d87-52fe-a7af-18f483b4e0e7", 00:23:28.746 "is_configured": true, 00:23:28.746 "data_offset": 0, 00:23:28.746 "data_size": 65536 00:23:28.746 }, 00:23:28.746 { 00:23:28.746 "name": "BaseBdev2", 00:23:28.746 "uuid": "6679b815-c793-5cd4-8ed8-9fcfd1002f44", 00:23:28.746 "is_configured": true, 00:23:28.746 "data_offset": 0, 00:23:28.746 "data_size": 65536 00:23:28.746 }, 00:23:28.746 { 00:23:28.746 "name": "BaseBdev3", 00:23:28.746 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:28.746 "is_configured": true, 00:23:28.746 "data_offset": 0, 00:23:28.746 "data_size": 65536 00:23:28.746 }, 00:23:28.746 { 00:23:28.746 "name": "BaseBdev4", 00:23:28.746 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:28.746 "is_configured": true, 00:23:28.746 "data_offset": 0, 00:23:28.746 "data_size": 65536 00:23:28.746 } 00:23:28.746 ] 00:23:28.746 }' 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.746 07:59:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:29.314 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:29.314 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:29.573 [2024-07-15 07:59:14.176614] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:29.573 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:29.573 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 
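For reference, the RPC sequence this raid_rebuild_test_io trace is exercising can be reproduced by hand roughly as follows. This is a minimal sketch, not part of the console output: it assumes an SPDK application (here the bdevperf instance started above) is already listening on /var/tmp/spdk-raid.sock, "rpc.py" abbreviates the full scripts/rpc.py path shown in the trace, and the malloc sizes, bdev names, delay parameters and jq filters are copied from the traced commands; the loop and the trailing .state/.process selectors are a condensation for illustration.

    # Sketch of the traced setup (assumes the target already listens on the socket).
    RPC="rpc.py -s /var/tmp/spdk-raid.sock"

    # Base bdevs: each is a malloc bdev wrapped in a passthru bdev, as in the trace.
    for i in 1 2 3 4; do
      $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
      $RPC bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done

    # Spare: malloc -> delay -> passthru, matching the spare_malloc/spare_delay/spare chain.
    $RPC bdev_malloc_create 32 512 -b spare_malloc
    $RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $RPC bdev_passthru_create -b spare_delay -p spare

    # Assemble the RAID1 bdev and confirm it comes up online with 4 base bdevs.
    $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'

    # Degrade the array, then re-add the spare to start a rebuild; the verify helpers
    # in the trace poll .process.type / .process.target until they read "rebuild"/"spare".
    $RPC bdev_raid_remove_base_bdev BaseBdev1
    $RPC bdev_raid_add_base_bdev raid_bdev1 spare
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process'
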
00:23:29.573 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.832 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:29.832 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:29.832 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:29.832 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:29.832 [2024-07-15 07:59:14.482611] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14581c0 00:23:29.832 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:29.832 Zero copy mechanism will not be used. 00:23:29.832 Running I/O for 60 seconds... 00:23:29.832 [2024-07-15 07:59:14.566470] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:29.832 [2024-07-15 07:59:14.573027] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x14581c0 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.091 "name": "raid_bdev1", 00:23:30.091 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:30.091 "strip_size_kb": 0, 00:23:30.091 "state": "online", 00:23:30.091 "raid_level": "raid1", 00:23:30.091 "superblock": false, 00:23:30.091 "num_base_bdevs": 4, 00:23:30.091 "num_base_bdevs_discovered": 3, 00:23:30.091 "num_base_bdevs_operational": 3, 00:23:30.091 "base_bdevs_list": [ 00:23:30.091 { 00:23:30.091 "name": null, 00:23:30.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.091 "is_configured": false, 00:23:30.091 "data_offset": 0, 00:23:30.091 "data_size": 65536 00:23:30.091 }, 00:23:30.091 { 00:23:30.091 "name": "BaseBdev2", 00:23:30.091 "uuid": 
"6679b815-c793-5cd4-8ed8-9fcfd1002f44", 00:23:30.091 "is_configured": true, 00:23:30.091 "data_offset": 0, 00:23:30.091 "data_size": 65536 00:23:30.091 }, 00:23:30.091 { 00:23:30.091 "name": "BaseBdev3", 00:23:30.091 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:30.091 "is_configured": true, 00:23:30.091 "data_offset": 0, 00:23:30.091 "data_size": 65536 00:23:30.091 }, 00:23:30.091 { 00:23:30.091 "name": "BaseBdev4", 00:23:30.091 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:30.091 "is_configured": true, 00:23:30.091 "data_offset": 0, 00:23:30.091 "data_size": 65536 00:23:30.091 } 00:23:30.091 ] 00:23:30.091 }' 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.091 07:59:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:30.660 07:59:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:30.921 [2024-07-15 07:59:15.499192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:30.921 07:59:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:30.921 [2024-07-15 07:59:15.556961] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14583f0 00:23:30.921 [2024-07-15 07:59:15.558772] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:31.181 [2024-07-15 07:59:15.681114] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:31.181 [2024-07-15 07:59:15.681889] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:31.181 [2024-07-15 07:59:15.892109] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:31.181 [2024-07-15 07:59:15.892524] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:31.751 [2024-07-15 07:59:16.242264] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:31.751 [2024-07-15 07:59:16.460398] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.010 "name": "raid_bdev1", 00:23:32.010 "uuid": 
"db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:32.010 "strip_size_kb": 0, 00:23:32.010 "state": "online", 00:23:32.010 "raid_level": "raid1", 00:23:32.010 "superblock": false, 00:23:32.010 "num_base_bdevs": 4, 00:23:32.010 "num_base_bdevs_discovered": 4, 00:23:32.010 "num_base_bdevs_operational": 4, 00:23:32.010 "process": { 00:23:32.010 "type": "rebuild", 00:23:32.010 "target": "spare", 00:23:32.010 "progress": { 00:23:32.010 "blocks": 12288, 00:23:32.010 "percent": 18 00:23:32.010 } 00:23:32.010 }, 00:23:32.010 "base_bdevs_list": [ 00:23:32.010 { 00:23:32.010 "name": "spare", 00:23:32.010 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:32.010 "is_configured": true, 00:23:32.010 "data_offset": 0, 00:23:32.010 "data_size": 65536 00:23:32.010 }, 00:23:32.010 { 00:23:32.010 "name": "BaseBdev2", 00:23:32.010 "uuid": "6679b815-c793-5cd4-8ed8-9fcfd1002f44", 00:23:32.010 "is_configured": true, 00:23:32.010 "data_offset": 0, 00:23:32.010 "data_size": 65536 00:23:32.010 }, 00:23:32.010 { 00:23:32.010 "name": "BaseBdev3", 00:23:32.010 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:32.010 "is_configured": true, 00:23:32.010 "data_offset": 0, 00:23:32.010 "data_size": 65536 00:23:32.010 }, 00:23:32.010 { 00:23:32.010 "name": "BaseBdev4", 00:23:32.010 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:32.010 "is_configured": true, 00:23:32.010 "data_offset": 0, 00:23:32.010 "data_size": 65536 00:23:32.010 } 00:23:32.010 ] 00:23:32.010 }' 00:23:32.010 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.269 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:32.269 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.269 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:32.269 07:59:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:32.840 [2024-07-15 07:59:17.353128] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:32.840 [2024-07-15 07:59:17.484067] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:32.840 [2024-07-15 07:59:17.493445] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:32.840 [2024-07-15 07:59:17.493465] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:32.840 [2024-07-15 07:59:17.493471] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:32.840 [2024-07-15 07:59:17.517309] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x14581c0 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.840 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.099 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.099 "name": "raid_bdev1", 00:23:33.099 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:33.099 "strip_size_kb": 0, 00:23:33.099 "state": "online", 00:23:33.099 "raid_level": "raid1", 00:23:33.099 "superblock": false, 00:23:33.099 "num_base_bdevs": 4, 00:23:33.099 "num_base_bdevs_discovered": 3, 00:23:33.099 "num_base_bdevs_operational": 3, 00:23:33.099 "base_bdevs_list": [ 00:23:33.099 { 00:23:33.099 "name": null, 00:23:33.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.099 "is_configured": false, 00:23:33.099 "data_offset": 0, 00:23:33.099 "data_size": 65536 00:23:33.099 }, 00:23:33.099 { 00:23:33.100 "name": "BaseBdev2", 00:23:33.100 "uuid": "6679b815-c793-5cd4-8ed8-9fcfd1002f44", 00:23:33.100 "is_configured": true, 00:23:33.100 "data_offset": 0, 00:23:33.100 "data_size": 65536 00:23:33.100 }, 00:23:33.100 { 00:23:33.100 "name": "BaseBdev3", 00:23:33.100 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:33.100 "is_configured": true, 00:23:33.100 "data_offset": 0, 00:23:33.100 "data_size": 65536 00:23:33.100 }, 00:23:33.100 { 00:23:33.100 "name": "BaseBdev4", 00:23:33.100 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:33.100 "is_configured": true, 00:23:33.100 "data_offset": 0, 00:23:33.100 "data_size": 65536 00:23:33.100 } 00:23:33.100 ] 00:23:33.100 }' 00:23:33.100 07:59:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.100 07:59:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.670 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.931 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.931 "name": "raid_bdev1", 00:23:33.931 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:33.931 "strip_size_kb": 0, 00:23:33.931 "state": "online", 00:23:33.931 "raid_level": 
"raid1", 00:23:33.931 "superblock": false, 00:23:33.931 "num_base_bdevs": 4, 00:23:33.931 "num_base_bdevs_discovered": 3, 00:23:33.931 "num_base_bdevs_operational": 3, 00:23:33.931 "base_bdevs_list": [ 00:23:33.931 { 00:23:33.931 "name": null, 00:23:33.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.931 "is_configured": false, 00:23:33.931 "data_offset": 0, 00:23:33.931 "data_size": 65536 00:23:33.931 }, 00:23:33.931 { 00:23:33.931 "name": "BaseBdev2", 00:23:33.931 "uuid": "6679b815-c793-5cd4-8ed8-9fcfd1002f44", 00:23:33.931 "is_configured": true, 00:23:33.931 "data_offset": 0, 00:23:33.931 "data_size": 65536 00:23:33.931 }, 00:23:33.931 { 00:23:33.931 "name": "BaseBdev3", 00:23:33.931 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:33.931 "is_configured": true, 00:23:33.931 "data_offset": 0, 00:23:33.931 "data_size": 65536 00:23:33.931 }, 00:23:33.931 { 00:23:33.931 "name": "BaseBdev4", 00:23:33.931 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:33.931 "is_configured": true, 00:23:33.931 "data_offset": 0, 00:23:33.931 "data_size": 65536 00:23:33.931 } 00:23:33.931 ] 00:23:33.931 }' 00:23:33.931 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.931 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:33.931 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.931 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:33.931 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:34.193 [2024-07-15 07:59:18.786062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:34.193 07:59:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:34.193 [2024-07-15 07:59:18.843676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146eac0 00:23:34.193 [2024-07-15 07:59:18.844856] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:34.453 [2024-07-15 07:59:18.954143] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:34.453 [2024-07-15 07:59:18.954913] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:34.453 [2024-07-15 07:59:19.181291] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:34.453 [2024-07-15 07:59:19.181434] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:35.022 [2024-07-15 07:59:19.551304] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:35.282 07:59:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.282 07:59:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.282 07:59:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:35.282 07:59:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:35.282 07:59:19 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.282 07:59:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.282 07:59:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.282 [2024-07-15 07:59:19.884122] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.543 "name": "raid_bdev1", 00:23:35.543 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:35.543 "strip_size_kb": 0, 00:23:35.543 "state": "online", 00:23:35.543 "raid_level": "raid1", 00:23:35.543 "superblock": false, 00:23:35.543 "num_base_bdevs": 4, 00:23:35.543 "num_base_bdevs_discovered": 4, 00:23:35.543 "num_base_bdevs_operational": 4, 00:23:35.543 "process": { 00:23:35.543 "type": "rebuild", 00:23:35.543 "target": "spare", 00:23:35.543 "progress": { 00:23:35.543 "blocks": 14336, 00:23:35.543 "percent": 21 00:23:35.543 } 00:23:35.543 }, 00:23:35.543 "base_bdevs_list": [ 00:23:35.543 { 00:23:35.543 "name": "spare", 00:23:35.543 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:35.543 "is_configured": true, 00:23:35.543 "data_offset": 0, 00:23:35.543 "data_size": 65536 00:23:35.543 }, 00:23:35.543 { 00:23:35.543 "name": "BaseBdev2", 00:23:35.543 "uuid": "6679b815-c793-5cd4-8ed8-9fcfd1002f44", 00:23:35.543 "is_configured": true, 00:23:35.543 "data_offset": 0, 00:23:35.543 "data_size": 65536 00:23:35.543 }, 00:23:35.543 { 00:23:35.543 "name": "BaseBdev3", 00:23:35.543 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:35.543 "is_configured": true, 00:23:35.543 "data_offset": 0, 00:23:35.543 "data_size": 65536 00:23:35.543 }, 00:23:35.543 { 00:23:35.543 "name": "BaseBdev4", 00:23:35.543 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:35.543 "is_configured": true, 00:23:35.543 "data_offset": 0, 00:23:35.543 "data_size": 65536 00:23:35.543 } 00:23:35.543 ] 00:23:35.543 }' 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.543 [2024-07-15 07:59:20.087087] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:35.543 [2024-07-15 07:59:20.087249] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:35.543 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:35.803 [2024-07-15 07:59:20.309288] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:35.803 [2024-07-15 07:59:20.329062] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:35.803 [2024-07-15 07:59:20.329321] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:35.803 [2024-07-15 07:59:20.342819] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x14581c0 00:23:35.804 [2024-07-15 07:59:20.342836] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x146eac0 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.804 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.804 [2024-07-15 07:59:20.453693] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.064 "name": "raid_bdev1", 00:23:36.064 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:36.064 "strip_size_kb": 0, 00:23:36.064 "state": "online", 00:23:36.064 "raid_level": "raid1", 00:23:36.064 "superblock": false, 00:23:36.064 "num_base_bdevs": 4, 00:23:36.064 "num_base_bdevs_discovered": 3, 00:23:36.064 "num_base_bdevs_operational": 3, 00:23:36.064 "process": { 00:23:36.064 "type": "rebuild", 00:23:36.064 "target": "spare", 00:23:36.064 "progress": { 00:23:36.064 "blocks": 22528, 00:23:36.064 "percent": 34 00:23:36.064 } 00:23:36.064 }, 00:23:36.064 "base_bdevs_list": [ 00:23:36.064 { 00:23:36.064 "name": "spare", 00:23:36.064 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:36.064 "is_configured": true, 00:23:36.064 "data_offset": 0, 00:23:36.064 "data_size": 65536 00:23:36.064 }, 00:23:36.064 { 00:23:36.064 "name": null, 00:23:36.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.064 "is_configured": false, 00:23:36.064 "data_offset": 0, 00:23:36.064 "data_size": 65536 00:23:36.064 }, 00:23:36.064 { 00:23:36.064 "name": "BaseBdev3", 00:23:36.064 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:36.064 "is_configured": true, 00:23:36.064 "data_offset": 0, 00:23:36.064 "data_size": 65536 00:23:36.064 }, 00:23:36.064 { 00:23:36.064 "name": "BaseBdev4", 00:23:36.064 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:36.064 "is_configured": true, 00:23:36.064 
"data_offset": 0, 00:23:36.064 "data_size": 65536 00:23:36.064 } 00:23:36.064 ] 00:23:36.064 }' 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=826 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.064 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.064 [2024-07-15 07:59:20.700344] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:36.064 [2024-07-15 07:59:20.808522] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:36.324 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.324 "name": "raid_bdev1", 00:23:36.324 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:36.324 "strip_size_kb": 0, 00:23:36.324 "state": "online", 00:23:36.324 "raid_level": "raid1", 00:23:36.324 "superblock": false, 00:23:36.324 "num_base_bdevs": 4, 00:23:36.324 "num_base_bdevs_discovered": 3, 00:23:36.324 "num_base_bdevs_operational": 3, 00:23:36.324 "process": { 00:23:36.324 "type": "rebuild", 00:23:36.324 "target": "spare", 00:23:36.324 "progress": { 00:23:36.324 "blocks": 28672, 00:23:36.324 "percent": 43 00:23:36.324 } 00:23:36.324 }, 00:23:36.324 "base_bdevs_list": [ 00:23:36.324 { 00:23:36.324 "name": "spare", 00:23:36.324 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:36.324 "is_configured": true, 00:23:36.324 "data_offset": 0, 00:23:36.324 "data_size": 65536 00:23:36.324 }, 00:23:36.324 { 00:23:36.324 "name": null, 00:23:36.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.324 "is_configured": false, 00:23:36.324 "data_offset": 0, 00:23:36.324 "data_size": 65536 00:23:36.324 }, 00:23:36.324 { 00:23:36.324 "name": "BaseBdev3", 00:23:36.324 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:36.324 "is_configured": true, 00:23:36.324 "data_offset": 0, 00:23:36.324 "data_size": 65536 00:23:36.324 }, 00:23:36.324 { 00:23:36.324 "name": "BaseBdev4", 00:23:36.324 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:36.324 "is_configured": true, 00:23:36.324 "data_offset": 0, 00:23:36.324 "data_size": 65536 
00:23:36.324 } 00:23:36.324 ] 00:23:36.324 }' 00:23:36.324 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.324 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.324 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.324 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.324 07:59:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:36.585 [2024-07-15 07:59:21.156686] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:36.844 [2024-07-15 07:59:21.388430] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:36.844 [2024-07-15 07:59:21.388578] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:37.103 [2024-07-15 07:59:21.729232] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:37.103 [2024-07-15 07:59:21.729357] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.362 07:59:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.622 07:59:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.622 "name": "raid_bdev1", 00:23:37.622 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:37.622 "strip_size_kb": 0, 00:23:37.622 "state": "online", 00:23:37.622 "raid_level": "raid1", 00:23:37.622 "superblock": false, 00:23:37.622 "num_base_bdevs": 4, 00:23:37.622 "num_base_bdevs_discovered": 3, 00:23:37.622 "num_base_bdevs_operational": 3, 00:23:37.622 "process": { 00:23:37.622 "type": "rebuild", 00:23:37.622 "target": "spare", 00:23:37.622 "progress": { 00:23:37.622 "blocks": 47104, 00:23:37.622 "percent": 71 00:23:37.622 } 00:23:37.622 }, 00:23:37.622 "base_bdevs_list": [ 00:23:37.622 { 00:23:37.622 "name": "spare", 00:23:37.622 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:37.622 "is_configured": true, 00:23:37.622 "data_offset": 0, 00:23:37.622 "data_size": 65536 00:23:37.622 }, 00:23:37.622 { 00:23:37.622 "name": null, 00:23:37.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.622 "is_configured": false, 00:23:37.622 "data_offset": 0, 00:23:37.622 "data_size": 65536 00:23:37.622 }, 
00:23:37.622 { 00:23:37.622 "name": "BaseBdev3", 00:23:37.622 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:37.622 "is_configured": true, 00:23:37.622 "data_offset": 0, 00:23:37.622 "data_size": 65536 00:23:37.622 }, 00:23:37.622 { 00:23:37.622 "name": "BaseBdev4", 00:23:37.622 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:37.622 "is_configured": true, 00:23:37.622 "data_offset": 0, 00:23:37.622 "data_size": 65536 00:23:37.622 } 00:23:37.622 ] 00:23:37.622 }' 00:23:37.622 07:59:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.622 07:59:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:37.622 07:59:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.622 07:59:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:37.622 07:59:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:37.622 [2024-07-15 07:59:22.286191] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:37.881 [2024-07-15 07:59:22.393647] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:38.142 [2024-07-15 07:59:22.810751] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:38.402 [2024-07-15 07:59:23.151091] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:38.662 [2024-07-15 07:59:23.251312] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:38.662 [2024-07-15 07:59:23.252726] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.662 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.922 "name": "raid_bdev1", 00:23:38.922 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:38.922 "strip_size_kb": 0, 00:23:38.922 "state": "online", 00:23:38.922 "raid_level": "raid1", 00:23:38.922 "superblock": false, 00:23:38.922 "num_base_bdevs": 4, 00:23:38.922 "num_base_bdevs_discovered": 3, 00:23:38.922 "num_base_bdevs_operational": 3, 00:23:38.922 "base_bdevs_list": [ 00:23:38.922 { 00:23:38.922 "name": "spare", 00:23:38.922 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:38.922 "is_configured": true, 00:23:38.922 
"data_offset": 0, 00:23:38.922 "data_size": 65536 00:23:38.922 }, 00:23:38.922 { 00:23:38.922 "name": null, 00:23:38.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.922 "is_configured": false, 00:23:38.922 "data_offset": 0, 00:23:38.922 "data_size": 65536 00:23:38.922 }, 00:23:38.922 { 00:23:38.922 "name": "BaseBdev3", 00:23:38.922 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:38.922 "is_configured": true, 00:23:38.922 "data_offset": 0, 00:23:38.922 "data_size": 65536 00:23:38.922 }, 00:23:38.922 { 00:23:38.922 "name": "BaseBdev4", 00:23:38.922 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:38.922 "is_configured": true, 00:23:38.922 "data_offset": 0, 00:23:38.922 "data_size": 65536 00:23:38.922 } 00:23:38.922 ] 00:23:38.922 }' 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.922 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.182 "name": "raid_bdev1", 00:23:39.182 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:39.182 "strip_size_kb": 0, 00:23:39.182 "state": "online", 00:23:39.182 "raid_level": "raid1", 00:23:39.182 "superblock": false, 00:23:39.182 "num_base_bdevs": 4, 00:23:39.182 "num_base_bdevs_discovered": 3, 00:23:39.182 "num_base_bdevs_operational": 3, 00:23:39.182 "base_bdevs_list": [ 00:23:39.182 { 00:23:39.182 "name": "spare", 00:23:39.182 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:39.182 "is_configured": true, 00:23:39.182 "data_offset": 0, 00:23:39.182 "data_size": 65536 00:23:39.182 }, 00:23:39.182 { 00:23:39.182 "name": null, 00:23:39.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.182 "is_configured": false, 00:23:39.182 "data_offset": 0, 00:23:39.182 "data_size": 65536 00:23:39.182 }, 00:23:39.182 { 00:23:39.182 "name": "BaseBdev3", 00:23:39.182 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:39.182 "is_configured": true, 00:23:39.182 "data_offset": 0, 00:23:39.182 "data_size": 65536 00:23:39.182 }, 00:23:39.182 { 00:23:39.182 "name": "BaseBdev4", 00:23:39.182 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:39.182 "is_configured": true, 00:23:39.182 "data_offset": 0, 00:23:39.182 "data_size": 65536 
00:23:39.182 } 00:23:39.182 ] 00:23:39.182 }' 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.182 07:59:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.442 07:59:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.442 "name": "raid_bdev1", 00:23:39.442 "uuid": "db7b42d1-df45-4adc-8670-b628ec0a0589", 00:23:39.442 "strip_size_kb": 0, 00:23:39.442 "state": "online", 00:23:39.442 "raid_level": "raid1", 00:23:39.442 "superblock": false, 00:23:39.442 "num_base_bdevs": 4, 00:23:39.442 "num_base_bdevs_discovered": 3, 00:23:39.442 "num_base_bdevs_operational": 3, 00:23:39.442 "base_bdevs_list": [ 00:23:39.442 { 00:23:39.442 "name": "spare", 00:23:39.442 "uuid": "3fa1b70c-456e-5d4f-9f6e-db1276827162", 00:23:39.442 "is_configured": true, 00:23:39.442 "data_offset": 0, 00:23:39.442 "data_size": 65536 00:23:39.442 }, 00:23:39.442 { 00:23:39.442 "name": null, 00:23:39.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.442 "is_configured": false, 00:23:39.442 "data_offset": 0, 00:23:39.442 "data_size": 65536 00:23:39.442 }, 00:23:39.442 { 00:23:39.442 "name": "BaseBdev3", 00:23:39.442 "uuid": "fcac5f30-ed11-5a35-86bd-65a3d0b64bc1", 00:23:39.442 "is_configured": true, 00:23:39.442 "data_offset": 0, 00:23:39.442 "data_size": 65536 00:23:39.442 }, 00:23:39.442 { 00:23:39.442 "name": "BaseBdev4", 00:23:39.442 "uuid": "a4a0bdd3-7e8f-5a36-8cf4-323822d2e373", 00:23:39.442 "is_configured": true, 00:23:39.442 "data_offset": 0, 00:23:39.442 "data_size": 65536 00:23:39.442 } 00:23:39.442 ] 00:23:39.442 }' 00:23:39.442 07:59:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.442 07:59:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:40.013 07:59:24 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:40.274 [2024-07-15 07:59:25.020608] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:40.274 [2024-07-15 07:59:25.020633] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:40.618 00:23:40.618 Latency(us) 00:23:40.618 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:40.618 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:40.618 raid_bdev1 : 10.58 99.97 299.90 0.00 0.00 13420.13 248.91 109697.18 00:23:40.618 =================================================================================================================== 00:23:40.618 Total : 99.97 299.90 0.00 0.00 13420.13 248.91 109697.18 00:23:40.618 [2024-07-15 07:59:25.095991] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.618 [2024-07-15 07:59:25.096016] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:40.618 [2024-07-15 07:59:25.096094] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:40.618 [2024-07-15 07:59:25.096101] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1452b30 name raid_bdev1, state offline 00:23:40.618 0 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.618 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:41.190 /dev/nbd0 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd0 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:41.190 1+0 records in 00:23:41.190 1+0 records out 00:23:41.190 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268006 s, 15.3 MB/s 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:41.190 07:59:25 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:41.452 /dev/nbd1 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:41.452 1+0 records in 00:23:41.452 1+0 records out 00:23:41.452 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279751 s, 14.6 MB/s 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:41.452 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- 
# local nbd_name=nbd1 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:41.713 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:23:42.284 /dev/nbd1 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:42.284 1+0 records in 00:23:42.284 1+0 records out 00:23:42.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336695 s, 12.2 MB/s 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@884 -- # size=4096 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:42.284 07:59:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:42.544 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:43.112 
07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1732906 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1732906 ']' 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1732906 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1732906 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1732906' 00:23:43.112 killing process with pid 1732906 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1732906 00:23:43.112 Received shutdown signal, test time was about 13.337902 seconds 00:23:43.112 00:23:43.112 Latency(us) 00:23:43.112 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:43.112 =================================================================================================================== 00:23:43.112 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:43.112 [2024-07-15 07:59:27.853561] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:43.112 07:59:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1732906 00:23:43.371 [2024-07-15 07:59:27.876605] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:43.371 07:59:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:43.371 00:23:43.371 real 0m17.881s 00:23:43.371 user 0m28.441s 00:23:43.371 sys 0m2.556s 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:43.372 ************************************ 00:23:43.372 END TEST raid_rebuild_test_io 00:23:43.372 ************************************ 00:23:43.372 07:59:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:43.372 07:59:28 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:23:43.372 07:59:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:43.372 07:59:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:43.372 07:59:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:43.372 ************************************ 00:23:43.372 START TEST raid_rebuild_test_sb_io 00:23:43.372 
************************************ 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1736050 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # 
waitforlisten 1736050 /var/tmp/spdk-raid.sock 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1736050 ']' 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:43.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:43.372 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:43.631 [2024-07-15 07:59:28.143060] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:23:43.631 [2024-07-15 07:59:28.143104] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736050 ] 00:23:43.631 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:43.631 Zero copy mechanism will not be used. 00:23:43.631 [2024-07-15 07:59:28.220132] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:43.631 [2024-07-15 07:59:28.285176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:43.631 [2024-07-15 07:59:28.323727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:43.631 [2024-07-15 07:59:28.323751] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:44.572 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:44.572 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:44.572 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:44.572 07:59:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:44.572 BaseBdev1_malloc 00:23:44.572 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:44.833 [2024-07-15 07:59:29.341712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:44.833 [2024-07-15 07:59:29.341748] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:44.833 [2024-07-15 07:59:29.341761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a4d30 00:23:44.833 [2024-07-15 07:59:29.341767] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:44.833 [2024-07-15 07:59:29.343111] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:23:44.833 [2024-07-15 07:59:29.343131] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:44.833 BaseBdev1 00:23:44.833 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:44.833 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:44.833 BaseBdev2_malloc 00:23:44.833 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:45.093 [2024-07-15 07:59:29.716450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:45.093 [2024-07-15 07:59:29.716475] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.093 [2024-07-15 07:59:29.716485] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1857c60 00:23:45.093 [2024-07-15 07:59:29.716491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.093 [2024-07-15 07:59:29.717641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.093 [2024-07-15 07:59:29.717659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:45.093 BaseBdev2 00:23:45.093 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:45.093 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:45.354 BaseBdev3_malloc 00:23:45.354 07:59:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:45.354 [2024-07-15 07:59:30.107020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:45.354 [2024-07-15 07:59:30.107053] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.354 [2024-07-15 07:59:30.107065] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x183cb90 00:23:45.354 [2024-07-15 07:59:30.107072] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.354 [2024-07-15 07:59:30.108263] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.354 [2024-07-15 07:59:30.108281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:45.615 BaseBdev3 00:23:45.615 07:59:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:45.615 07:59:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:45.615 BaseBdev4_malloc 00:23:45.615 07:59:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:45.875 [2024-07-15 07:59:30.481507] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:45.875 [2024-07-15 07:59:30.481535] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.875 [2024-07-15 07:59:30.481545] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a58c0 00:23:45.875 [2024-07-15 07:59:30.481556] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.875 [2024-07-15 07:59:30.482716] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.875 [2024-07-15 07:59:30.482733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:45.875 BaseBdev4 00:23:45.875 07:59:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:46.136 spare_malloc 00:23:46.136 07:59:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:46.136 spare_delay 00:23:46.136 07:59:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:46.396 [2024-07-15 07:59:31.024616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:46.396 [2024-07-15 07:59:31.024645] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.396 [2024-07-15 07:59:31.024656] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x169da80 00:23:46.396 [2024-07-15 07:59:31.024662] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.396 [2024-07-15 07:59:31.025859] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.396 [2024-07-15 07:59:31.025877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:46.396 spare 00:23:46.396 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:46.656 [2024-07-15 07:59:31.205128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:46.656 [2024-07-15 07:59:31.206096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:46.656 [2024-07-15 07:59:31.206136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:46.656 [2024-07-15 07:59:31.206169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:46.656 [2024-07-15 07:59:31.206305] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x169eb30 00:23:46.656 [2024-07-15 07:59:31.206312] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:46.656 [2024-07-15 07:59:31.206455] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169c2e0 00:23:46.656 [2024-07-15 07:59:31.206569] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x169eb30 00:23:46.656 [2024-07-15 07:59:31.206574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x169eb30 00:23:46.656 [2024-07-15 07:59:31.206642] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.656 "name": "raid_bdev1", 00:23:46.656 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:46.656 "strip_size_kb": 0, 00:23:46.656 "state": "online", 00:23:46.656 "raid_level": "raid1", 00:23:46.656 "superblock": true, 00:23:46.656 "num_base_bdevs": 4, 00:23:46.656 "num_base_bdevs_discovered": 4, 00:23:46.656 "num_base_bdevs_operational": 4, 00:23:46.656 "base_bdevs_list": [ 00:23:46.656 { 00:23:46.656 "name": "BaseBdev1", 00:23:46.656 "uuid": "78ba4252-f0e4-55c8-8a73-c90d45dc867a", 00:23:46.656 "is_configured": true, 00:23:46.656 "data_offset": 2048, 00:23:46.656 "data_size": 63488 00:23:46.656 }, 00:23:46.656 { 00:23:46.656 "name": "BaseBdev2", 00:23:46.656 "uuid": "815ebc02-ad2b-56e1-b5d3-11b56274e545", 00:23:46.656 "is_configured": true, 00:23:46.656 "data_offset": 2048, 00:23:46.656 "data_size": 63488 00:23:46.656 }, 00:23:46.656 { 00:23:46.656 "name": "BaseBdev3", 00:23:46.656 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:46.656 "is_configured": true, 00:23:46.656 "data_offset": 2048, 00:23:46.656 "data_size": 63488 00:23:46.656 }, 00:23:46.656 { 00:23:46.656 "name": "BaseBdev4", 00:23:46.656 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:46.656 "is_configured": true, 00:23:46.656 "data_offset": 2048, 00:23:46.656 "data_size": 63488 00:23:46.656 } 00:23:46.656 ] 00:23:46.656 }' 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.656 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:47.226 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:47.226 07:59:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r 
'.[].num_blocks' 00:23:47.487 [2024-07-15 07:59:32.123666] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:47.487 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:47.487 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.487 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:47.748 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:47.748 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:47.748 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:47.748 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:47.748 [2024-07-15 07:59:32.429631] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a41a0 00:23:47.748 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:47.748 Zero copy mechanism will not be used. 00:23:47.748 Running I/O for 60 seconds... 00:23:48.009 [2024-07-15 07:59:32.513147] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:48.009 [2024-07-15 07:59:32.519758] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16a41a0 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.009 "name": "raid_bdev1", 00:23:48.009 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:48.009 "strip_size_kb": 0, 00:23:48.009 "state": "online", 00:23:48.009 "raid_level": "raid1", 00:23:48.009 "superblock": true, 00:23:48.009 
"num_base_bdevs": 4, 00:23:48.009 "num_base_bdevs_discovered": 3, 00:23:48.009 "num_base_bdevs_operational": 3, 00:23:48.009 "base_bdevs_list": [ 00:23:48.009 { 00:23:48.009 "name": null, 00:23:48.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.009 "is_configured": false, 00:23:48.009 "data_offset": 2048, 00:23:48.009 "data_size": 63488 00:23:48.009 }, 00:23:48.009 { 00:23:48.009 "name": "BaseBdev2", 00:23:48.009 "uuid": "815ebc02-ad2b-56e1-b5d3-11b56274e545", 00:23:48.009 "is_configured": true, 00:23:48.009 "data_offset": 2048, 00:23:48.009 "data_size": 63488 00:23:48.009 }, 00:23:48.009 { 00:23:48.009 "name": "BaseBdev3", 00:23:48.009 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:48.009 "is_configured": true, 00:23:48.009 "data_offset": 2048, 00:23:48.009 "data_size": 63488 00:23:48.009 }, 00:23:48.009 { 00:23:48.009 "name": "BaseBdev4", 00:23:48.009 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:48.009 "is_configured": true, 00:23:48.009 "data_offset": 2048, 00:23:48.009 "data_size": 63488 00:23:48.009 } 00:23:48.009 ] 00:23:48.009 }' 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.009 07:59:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:48.580 07:59:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:49.149 [2024-07-15 07:59:33.827910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:49.149 [2024-07-15 07:59:33.870421] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173c090 00:23:49.149 [2024-07-15 07:59:33.872070] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:49.149 07:59:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:49.409 [2024-07-15 07:59:34.002591] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:49.409 [2024-07-15 07:59:34.003379] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:49.669 [2024-07-15 07:59:34.205358] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:49.669 [2024-07-15 07:59:34.205476] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:49.929 [2024-07-15 07:59:34.454017] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:49.929 [2024-07-15 07:59:34.454249] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:49.929 [2024-07-15 07:59:34.578680] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:50.190 [2024-07-15 07:59:34.813021] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:50.190 07:59:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:50.190 07:59:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.190 07:59:34 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:50.190 07:59:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:50.190 07:59:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.190 07:59:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.190 07:59:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.449 [2024-07-15 07:59:35.045255] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:50.449 [2024-07-15 07:59:35.045433] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:50.449 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.449 "name": "raid_bdev1", 00:23:50.450 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:50.450 "strip_size_kb": 0, 00:23:50.450 "state": "online", 00:23:50.450 "raid_level": "raid1", 00:23:50.450 "superblock": true, 00:23:50.450 "num_base_bdevs": 4, 00:23:50.450 "num_base_bdevs_discovered": 4, 00:23:50.450 "num_base_bdevs_operational": 4, 00:23:50.450 "process": { 00:23:50.450 "type": "rebuild", 00:23:50.450 "target": "spare", 00:23:50.450 "progress": { 00:23:50.450 "blocks": 16384, 00:23:50.450 "percent": 25 00:23:50.450 } 00:23:50.450 }, 00:23:50.450 "base_bdevs_list": [ 00:23:50.450 { 00:23:50.450 "name": "spare", 00:23:50.450 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:50.450 "is_configured": true, 00:23:50.450 "data_offset": 2048, 00:23:50.450 "data_size": 63488 00:23:50.450 }, 00:23:50.450 { 00:23:50.450 "name": "BaseBdev2", 00:23:50.450 "uuid": "815ebc02-ad2b-56e1-b5d3-11b56274e545", 00:23:50.450 "is_configured": true, 00:23:50.450 "data_offset": 2048, 00:23:50.450 "data_size": 63488 00:23:50.450 }, 00:23:50.450 { 00:23:50.450 "name": "BaseBdev3", 00:23:50.450 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:50.450 "is_configured": true, 00:23:50.450 "data_offset": 2048, 00:23:50.450 "data_size": 63488 00:23:50.450 }, 00:23:50.450 { 00:23:50.450 "name": "BaseBdev4", 00:23:50.450 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:50.450 "is_configured": true, 00:23:50.450 "data_offset": 2048, 00:23:50.450 "data_size": 63488 00:23:50.450 } 00:23:50.450 ] 00:23:50.450 }' 00:23:50.450 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.450 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:50.450 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.450 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:50.450 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:50.711 [2024-07-15 07:59:35.396344] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:50.971 [2024-07-15 07:59:35.681746] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:51.232 
[2024-07-15 07:59:35.875093] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:51.232 [2024-07-15 07:59:35.877288] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:51.232 [2024-07-15 07:59:35.877310] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:51.232 [2024-07-15 07:59:35.877316] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:51.232 [2024-07-15 07:59:35.895647] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16a41a0 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.232 07:59:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.492 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:51.492 "name": "raid_bdev1", 00:23:51.492 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:51.492 "strip_size_kb": 0, 00:23:51.492 "state": "online", 00:23:51.492 "raid_level": "raid1", 00:23:51.492 "superblock": true, 00:23:51.492 "num_base_bdevs": 4, 00:23:51.492 "num_base_bdevs_discovered": 3, 00:23:51.492 "num_base_bdevs_operational": 3, 00:23:51.492 "base_bdevs_list": [ 00:23:51.492 { 00:23:51.492 "name": null, 00:23:51.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.492 "is_configured": false, 00:23:51.492 "data_offset": 2048, 00:23:51.492 "data_size": 63488 00:23:51.492 }, 00:23:51.492 { 00:23:51.492 "name": "BaseBdev2", 00:23:51.492 "uuid": "815ebc02-ad2b-56e1-b5d3-11b56274e545", 00:23:51.492 "is_configured": true, 00:23:51.492 "data_offset": 2048, 00:23:51.492 "data_size": 63488 00:23:51.492 }, 00:23:51.492 { 00:23:51.492 "name": "BaseBdev3", 00:23:51.492 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:51.492 "is_configured": true, 00:23:51.492 "data_offset": 2048, 00:23:51.492 "data_size": 63488 00:23:51.492 }, 00:23:51.492 { 00:23:51.492 "name": "BaseBdev4", 00:23:51.492 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:51.492 "is_configured": true, 00:23:51.492 "data_offset": 2048, 00:23:51.492 "data_size": 63488 00:23:51.492 } 00:23:51.492 ] 00:23:51.492 }' 00:23:51.492 
07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:51.492 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.062 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.323 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.323 "name": "raid_bdev1", 00:23:52.323 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:52.323 "strip_size_kb": 0, 00:23:52.323 "state": "online", 00:23:52.323 "raid_level": "raid1", 00:23:52.323 "superblock": true, 00:23:52.323 "num_base_bdevs": 4, 00:23:52.323 "num_base_bdevs_discovered": 3, 00:23:52.323 "num_base_bdevs_operational": 3, 00:23:52.323 "base_bdevs_list": [ 00:23:52.323 { 00:23:52.323 "name": null, 00:23:52.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:52.323 "is_configured": false, 00:23:52.323 "data_offset": 2048, 00:23:52.323 "data_size": 63488 00:23:52.323 }, 00:23:52.323 { 00:23:52.323 "name": "BaseBdev2", 00:23:52.323 "uuid": "815ebc02-ad2b-56e1-b5d3-11b56274e545", 00:23:52.323 "is_configured": true, 00:23:52.323 "data_offset": 2048, 00:23:52.323 "data_size": 63488 00:23:52.323 }, 00:23:52.323 { 00:23:52.323 "name": "BaseBdev3", 00:23:52.323 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:52.323 "is_configured": true, 00:23:52.323 "data_offset": 2048, 00:23:52.323 "data_size": 63488 00:23:52.323 }, 00:23:52.323 { 00:23:52.323 "name": "BaseBdev4", 00:23:52.323 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:52.323 "is_configured": true, 00:23:52.323 "data_offset": 2048, 00:23:52.323 "data_size": 63488 00:23:52.323 } 00:23:52.323 ] 00:23:52.323 }' 00:23:52.323 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.323 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:52.323 07:59:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.323 07:59:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:52.323 07:59:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:52.583 [2024-07-15 07:59:37.210650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:52.583 07:59:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:52.583 [2024-07-15 07:59:37.260792] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a43d0 00:23:52.583 [2024-07-15 
07:59:37.261980] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:52.842 [2024-07-15 07:59:37.370605] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:52.842 [2024-07-15 07:59:37.370863] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:52.842 [2024-07-15 07:59:37.508612] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:52.842 [2024-07-15 07:59:37.509018] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:53.102 [2024-07-15 07:59:37.851909] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:53.102 [2024-07-15 07:59:37.852599] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:53.362 [2024-07-15 07:59:38.064898] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:53.362 [2024-07-15 07:59:38.065013] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.622 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.881 [2024-07-15 07:59:38.425381] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.881 "name": "raid_bdev1", 00:23:53.881 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:53.881 "strip_size_kb": 0, 00:23:53.881 "state": "online", 00:23:53.881 "raid_level": "raid1", 00:23:53.881 "superblock": true, 00:23:53.881 "num_base_bdevs": 4, 00:23:53.881 "num_base_bdevs_discovered": 4, 00:23:53.881 "num_base_bdevs_operational": 4, 00:23:53.881 "process": { 00:23:53.881 "type": "rebuild", 00:23:53.881 "target": "spare", 00:23:53.881 "progress": { 00:23:53.881 "blocks": 14336, 00:23:53.881 "percent": 22 00:23:53.881 } 00:23:53.881 }, 00:23:53.881 "base_bdevs_list": [ 00:23:53.881 { 00:23:53.881 "name": "spare", 00:23:53.881 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:53.881 "is_configured": true, 00:23:53.881 "data_offset": 2048, 00:23:53.881 "data_size": 63488 00:23:53.881 }, 00:23:53.881 { 00:23:53.881 "name": "BaseBdev2", 00:23:53.881 "uuid": "815ebc02-ad2b-56e1-b5d3-11b56274e545", 00:23:53.881 "is_configured": true, 00:23:53.881 
"data_offset": 2048, 00:23:53.881 "data_size": 63488 00:23:53.881 }, 00:23:53.881 { 00:23:53.881 "name": "BaseBdev3", 00:23:53.881 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:53.881 "is_configured": true, 00:23:53.881 "data_offset": 2048, 00:23:53.881 "data_size": 63488 00:23:53.881 }, 00:23:53.881 { 00:23:53.881 "name": "BaseBdev4", 00:23:53.881 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:53.881 "is_configured": true, 00:23:53.881 "data_offset": 2048, 00:23:53.881 "data_size": 63488 00:23:53.881 } 00:23:53.881 ] 00:23:53.881 }' 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:53.881 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:53.881 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:54.141 [2024-07-15 07:59:38.688645] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:54.141 [2024-07-15 07:59:38.751263] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:54.401 [2024-07-15 07:59:38.936617] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x16a41a0 00:23:54.401 [2024-07-15 07:59:38.936635] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x16a43d0 00:23:54.401 [2024-07-15 07:59:38.937028] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.401 07:59:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.662 "name": "raid_bdev1", 00:23:54.662 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:54.662 "strip_size_kb": 0, 00:23:54.662 "state": "online", 00:23:54.662 "raid_level": "raid1", 00:23:54.662 "superblock": true, 00:23:54.662 "num_base_bdevs": 4, 00:23:54.662 "num_base_bdevs_discovered": 3, 00:23:54.662 "num_base_bdevs_operational": 3, 00:23:54.662 "process": { 00:23:54.662 "type": "rebuild", 00:23:54.662 "target": "spare", 00:23:54.662 "progress": { 00:23:54.662 "blocks": 24576, 00:23:54.662 "percent": 38 00:23:54.662 } 00:23:54.662 }, 00:23:54.662 "base_bdevs_list": [ 00:23:54.662 { 00:23:54.662 "name": "spare", 00:23:54.662 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:54.662 "is_configured": true, 00:23:54.662 "data_offset": 2048, 00:23:54.662 "data_size": 63488 00:23:54.662 }, 00:23:54.662 { 00:23:54.662 "name": null, 00:23:54.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:54.662 "is_configured": false, 00:23:54.662 "data_offset": 2048, 00:23:54.662 "data_size": 63488 00:23:54.662 }, 00:23:54.662 { 00:23:54.662 "name": "BaseBdev3", 00:23:54.662 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:54.662 "is_configured": true, 00:23:54.662 "data_offset": 2048, 00:23:54.662 "data_size": 63488 00:23:54.662 }, 00:23:54.662 { 00:23:54.662 "name": "BaseBdev4", 00:23:54.662 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:54.662 "is_configured": true, 00:23:54.662 "data_offset": 2048, 00:23:54.662 "data_size": 63488 00:23:54.662 } 00:23:54.662 ] 00:23:54.662 }' 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.662 [2024-07-15 07:59:39.205318] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=845 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.662 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:23:54.922 [2024-07-15 07:59:39.429649] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:54.922 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.922 "name": "raid_bdev1", 00:23:54.922 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:54.922 "strip_size_kb": 0, 00:23:54.922 "state": "online", 00:23:54.922 "raid_level": "raid1", 00:23:54.922 "superblock": true, 00:23:54.922 "num_base_bdevs": 4, 00:23:54.922 "num_base_bdevs_discovered": 3, 00:23:54.922 "num_base_bdevs_operational": 3, 00:23:54.922 "process": { 00:23:54.922 "type": "rebuild", 00:23:54.922 "target": "spare", 00:23:54.922 "progress": { 00:23:54.922 "blocks": 26624, 00:23:54.922 "percent": 41 00:23:54.922 } 00:23:54.922 }, 00:23:54.922 "base_bdevs_list": [ 00:23:54.922 { 00:23:54.922 "name": "spare", 00:23:54.922 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:54.922 "is_configured": true, 00:23:54.922 "data_offset": 2048, 00:23:54.922 "data_size": 63488 00:23:54.922 }, 00:23:54.922 { 00:23:54.922 "name": null, 00:23:54.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:54.922 "is_configured": false, 00:23:54.922 "data_offset": 2048, 00:23:54.922 "data_size": 63488 00:23:54.922 }, 00:23:54.922 { 00:23:54.922 "name": "BaseBdev3", 00:23:54.922 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:54.922 "is_configured": true, 00:23:54.922 "data_offset": 2048, 00:23:54.922 "data_size": 63488 00:23:54.922 }, 00:23:54.922 { 00:23:54.922 "name": "BaseBdev4", 00:23:54.922 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:54.922 "is_configured": true, 00:23:54.922 "data_offset": 2048, 00:23:54.922 "data_size": 63488 00:23:54.922 } 00:23:54.922 ] 00:23:54.922 }' 00:23:54.922 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.922 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:54.922 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.922 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:54.922 07:59:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:54.922 [2024-07-15 07:59:39.654437] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:55.182 [2024-07-15 07:59:39.761928] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:55.182 [2024-07-15 07:59:39.762197] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:55.442 [2024-07-15 07:59:40.127109] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:23:55.702 [2024-07-15 07:59:40.343058] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:55.961 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:55.961 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:55.961 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:23:55.962 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:55.962 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:55.962 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:55.962 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.962 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.222 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.222 "name": "raid_bdev1", 00:23:56.222 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:56.222 "strip_size_kb": 0, 00:23:56.222 "state": "online", 00:23:56.222 "raid_level": "raid1", 00:23:56.222 "superblock": true, 00:23:56.222 "num_base_bdevs": 4, 00:23:56.222 "num_base_bdevs_discovered": 3, 00:23:56.222 "num_base_bdevs_operational": 3, 00:23:56.222 "process": { 00:23:56.222 "type": "rebuild", 00:23:56.222 "target": "spare", 00:23:56.222 "progress": { 00:23:56.222 "blocks": 45056, 00:23:56.222 "percent": 70 00:23:56.222 } 00:23:56.222 }, 00:23:56.222 "base_bdevs_list": [ 00:23:56.222 { 00:23:56.222 "name": "spare", 00:23:56.222 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:56.222 "is_configured": true, 00:23:56.222 "data_offset": 2048, 00:23:56.222 "data_size": 63488 00:23:56.222 }, 00:23:56.222 { 00:23:56.222 "name": null, 00:23:56.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.222 "is_configured": false, 00:23:56.222 "data_offset": 2048, 00:23:56.222 "data_size": 63488 00:23:56.222 }, 00:23:56.222 { 00:23:56.222 "name": "BaseBdev3", 00:23:56.222 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:56.222 "is_configured": true, 00:23:56.222 "data_offset": 2048, 00:23:56.222 "data_size": 63488 00:23:56.222 }, 00:23:56.222 { 00:23:56.222 "name": "BaseBdev4", 00:23:56.222 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:56.222 "is_configured": true, 00:23:56.222 "data_offset": 2048, 00:23:56.222 "data_size": 63488 00:23:56.222 } 00:23:56.222 ] 00:23:56.222 }' 00:23:56.222 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.222 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:56.222 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.222 [2024-07-15 07:59:40.790264] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:23:56.222 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:56.222 07:59:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:56.482 [2024-07-15 07:59:41.013319] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:56.482 [2024-07-15 07:59:41.013564] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:56.482 [2024-07-15 07:59:41.215388] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:23:56.741 [2024-07-15 07:59:41.431025] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.316 07:59:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.316 [2024-07-15 07:59:41.863548] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:57.316 [2024-07-15 07:59:41.963767] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:57.316 [2024-07-15 07:59:41.965203] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:57.316 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.316 "name": "raid_bdev1", 00:23:57.316 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:57.316 "strip_size_kb": 0, 00:23:57.316 "state": "online", 00:23:57.316 "raid_level": "raid1", 00:23:57.316 "superblock": true, 00:23:57.316 "num_base_bdevs": 4, 00:23:57.316 "num_base_bdevs_discovered": 3, 00:23:57.316 "num_base_bdevs_operational": 3, 00:23:57.316 "base_bdevs_list": [ 00:23:57.316 { 00:23:57.316 "name": "spare", 00:23:57.316 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:57.316 "is_configured": true, 00:23:57.316 "data_offset": 2048, 00:23:57.316 "data_size": 63488 00:23:57.316 }, 00:23:57.316 { 00:23:57.316 "name": null, 00:23:57.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.316 "is_configured": false, 00:23:57.316 "data_offset": 2048, 00:23:57.316 "data_size": 63488 00:23:57.316 }, 00:23:57.316 { 00:23:57.316 "name": "BaseBdev3", 00:23:57.316 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:57.316 "is_configured": true, 00:23:57.316 "data_offset": 2048, 00:23:57.316 "data_size": 63488 00:23:57.316 }, 00:23:57.316 { 00:23:57.316 "name": "BaseBdev4", 00:23:57.316 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:57.316 "is_configured": true, 00:23:57.316 "data_offset": 2048, 00:23:57.316 "data_size": 63488 00:23:57.316 } 00:23:57.316 ] 00:23:57.316 }' 00:23:57.316 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.316 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:57.316 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:23:57.612 07:59:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.612 "name": "raid_bdev1", 00:23:57.612 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:57.612 "strip_size_kb": 0, 00:23:57.612 "state": "online", 00:23:57.612 "raid_level": "raid1", 00:23:57.612 "superblock": true, 00:23:57.612 "num_base_bdevs": 4, 00:23:57.612 "num_base_bdevs_discovered": 3, 00:23:57.612 "num_base_bdevs_operational": 3, 00:23:57.612 "base_bdevs_list": [ 00:23:57.612 { 00:23:57.612 "name": "spare", 00:23:57.612 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:57.612 "is_configured": true, 00:23:57.612 "data_offset": 2048, 00:23:57.612 "data_size": 63488 00:23:57.612 }, 00:23:57.612 { 00:23:57.612 "name": null, 00:23:57.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.612 "is_configured": false, 00:23:57.612 "data_offset": 2048, 00:23:57.612 "data_size": 63488 00:23:57.612 }, 00:23:57.612 { 00:23:57.612 "name": "BaseBdev3", 00:23:57.612 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:57.612 "is_configured": true, 00:23:57.612 "data_offset": 2048, 00:23:57.612 "data_size": 63488 00:23:57.612 }, 00:23:57.612 { 00:23:57.612 "name": "BaseBdev4", 00:23:57.612 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:57.612 "is_configured": true, 00:23:57.612 "data_offset": 2048, 00:23:57.612 "data_size": 63488 00:23:57.612 } 00:23:57.612 ] 00:23:57.612 }' 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:57.612 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:57.872 "name": "raid_bdev1", 00:23:57.872 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:23:57.872 "strip_size_kb": 0, 00:23:57.872 "state": "online", 00:23:57.872 "raid_level": "raid1", 00:23:57.872 "superblock": true, 00:23:57.872 "num_base_bdevs": 4, 00:23:57.872 "num_base_bdevs_discovered": 3, 00:23:57.872 "num_base_bdevs_operational": 3, 00:23:57.872 "base_bdevs_list": [ 00:23:57.872 { 00:23:57.872 "name": "spare", 00:23:57.872 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:23:57.872 "is_configured": true, 00:23:57.872 "data_offset": 2048, 00:23:57.872 "data_size": 63488 00:23:57.872 }, 00:23:57.872 { 00:23:57.872 "name": null, 00:23:57.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.872 "is_configured": false, 00:23:57.872 "data_offset": 2048, 00:23:57.872 "data_size": 63488 00:23:57.872 }, 00:23:57.872 { 00:23:57.872 "name": "BaseBdev3", 00:23:57.872 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:23:57.872 "is_configured": true, 00:23:57.872 "data_offset": 2048, 00:23:57.872 "data_size": 63488 00:23:57.872 }, 00:23:57.872 { 00:23:57.872 "name": "BaseBdev4", 00:23:57.872 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:23:57.872 "is_configured": true, 00:23:57.872 "data_offset": 2048, 00:23:57.872 "data_size": 63488 00:23:57.872 } 00:23:57.872 ] 00:23:57.872 }' 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:57.872 07:59:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:58.441 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:58.700 [2024-07-15 07:59:43.301343] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:58.700 [2024-07-15 07:59:43.301367] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:58.700 00:23:58.700 Latency(us) 00:23:58.700 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:58.700 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:58.700 raid_bdev1 : 10.92 105.16 315.49 0.00 0.00 12268.15 252.06 115343.36 00:23:58.700 =================================================================================================================== 00:23:58.700 Total : 105.16 315.49 0.00 0.00 12268.15 252.06 115343.36 00:23:58.700 [2024-07-15 07:59:43.376755] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:58.700 [2024-07-15 07:59:43.376784] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:58.700 [2024-07-15 07:59:43.376862] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:58.700 [2024-07-15 07:59:43.376870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169eb30 name raid_bdev1, state offline 00:23:58.700 0 00:23:58.700 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.700 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:58.960 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:59.220 /dev/nbd0 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:59.220 1+0 records in 00:23:59.220 1+0 records out 00:23:59.220 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242352 s, 16.9 MB/s 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io 
-- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:59.220 07:59:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:23:59.480 /dev/nbd1 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:59.480 07:59:44 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:59.480 1+0 records in 00:23:59.480 1+0 records out 00:23:59.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270507 s, 15.1 MB/s 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:59.480 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- 
# local rpc_server=/var/tmp/spdk-raid.sock 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:59.741 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:24:00.002 /dev/nbd1 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:00.002 1+0 records in 00:24:00.002 1+0 records out 00:24:00.002 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000197431 s, 20.7 MB/s 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:00.002 07:59:44 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:00.002 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:00.262 07:59:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:00.262 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 
00:24:00.522 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:00.783 [2024-07-15 07:59:45.381467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:00.783 [2024-07-15 07:59:45.381502] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:00.783 [2024-07-15 07:59:45.381520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x169e830 00:24:00.783 [2024-07-15 07:59:45.381527] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:00.783 [2024-07-15 07:59:45.382888] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:00.783 [2024-07-15 07:59:45.382910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:00.783 [2024-07-15 07:59:45.382975] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:00.783 [2024-07-15 07:59:45.382996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:00.783 [2024-07-15 07:59:45.383078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:00.783 [2024-07-15 07:59:45.383134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:00.783 spare 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.783 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.783 [2024-07-15 07:59:45.483427] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16a1570 00:24:00.783 [2024-07-15 07:59:45.483437] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:00.783 [2024-07-15 07:59:45.483607] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ba3f0 00:24:00.783 [2024-07-15 07:59:45.483733] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16a1570 00:24:00.783 [2024-07-15 07:59:45.483739] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x16a1570 00:24:00.783 [2024-07-15 07:59:45.483823] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.043 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.043 "name": "raid_bdev1", 00:24:01.043 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:01.043 "strip_size_kb": 0, 00:24:01.043 "state": "online", 00:24:01.043 "raid_level": "raid1", 00:24:01.043 "superblock": true, 00:24:01.043 "num_base_bdevs": 4, 00:24:01.043 "num_base_bdevs_discovered": 3, 00:24:01.043 "num_base_bdevs_operational": 3, 00:24:01.043 "base_bdevs_list": [ 00:24:01.043 { 00:24:01.043 "name": "spare", 00:24:01.043 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:24:01.043 "is_configured": true, 00:24:01.043 "data_offset": 2048, 00:24:01.043 "data_size": 63488 00:24:01.043 }, 00:24:01.043 { 00:24:01.043 "name": null, 00:24:01.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.043 "is_configured": false, 00:24:01.043 "data_offset": 2048, 00:24:01.043 "data_size": 63488 00:24:01.043 }, 00:24:01.043 { 00:24:01.043 "name": "BaseBdev3", 00:24:01.043 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:01.043 "is_configured": true, 00:24:01.043 "data_offset": 2048, 00:24:01.043 "data_size": 63488 00:24:01.043 }, 00:24:01.043 { 00:24:01.043 "name": "BaseBdev4", 00:24:01.043 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:01.043 "is_configured": true, 00:24:01.043 "data_offset": 2048, 00:24:01.043 "data_size": 63488 00:24:01.043 } 00:24:01.043 ] 00:24:01.043 }' 00:24:01.043 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.043 07:59:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:01.611 "name": "raid_bdev1", 00:24:01.611 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:01.611 "strip_size_kb": 0, 00:24:01.611 "state": "online", 00:24:01.611 "raid_level": "raid1", 00:24:01.611 "superblock": true, 00:24:01.611 "num_base_bdevs": 4, 00:24:01.611 "num_base_bdevs_discovered": 3, 00:24:01.611 "num_base_bdevs_operational": 3, 00:24:01.611 "base_bdevs_list": [ 00:24:01.611 { 00:24:01.611 "name": "spare", 00:24:01.611 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:24:01.611 "is_configured": true, 00:24:01.611 "data_offset": 2048, 00:24:01.611 "data_size": 63488 00:24:01.611 }, 00:24:01.611 { 00:24:01.611 "name": null, 00:24:01.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.611 "is_configured": false, 00:24:01.611 "data_offset": 2048, 
00:24:01.611 "data_size": 63488 00:24:01.611 }, 00:24:01.611 { 00:24:01.611 "name": "BaseBdev3", 00:24:01.611 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:01.611 "is_configured": true, 00:24:01.611 "data_offset": 2048, 00:24:01.611 "data_size": 63488 00:24:01.611 }, 00:24:01.611 { 00:24:01.611 "name": "BaseBdev4", 00:24:01.611 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:01.611 "is_configured": true, 00:24:01.611 "data_offset": 2048, 00:24:01.611 "data_size": 63488 00:24:01.611 } 00:24:01.611 ] 00:24:01.611 }' 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:01.611 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:01.871 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:01.871 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.871 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:01.871 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:01.871 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:02.130 [2024-07-15 07:59:46.761185] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.130 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:02.389 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:02.389 "name": "raid_bdev1", 00:24:02.389 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:02.389 "strip_size_kb": 0, 00:24:02.389 "state": "online", 00:24:02.389 "raid_level": "raid1", 00:24:02.389 "superblock": true, 00:24:02.389 
"num_base_bdevs": 4, 00:24:02.389 "num_base_bdevs_discovered": 2, 00:24:02.389 "num_base_bdevs_operational": 2, 00:24:02.389 "base_bdevs_list": [ 00:24:02.389 { 00:24:02.389 "name": null, 00:24:02.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.389 "is_configured": false, 00:24:02.389 "data_offset": 2048, 00:24:02.389 "data_size": 63488 00:24:02.389 }, 00:24:02.389 { 00:24:02.389 "name": null, 00:24:02.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:02.389 "is_configured": false, 00:24:02.389 "data_offset": 2048, 00:24:02.389 "data_size": 63488 00:24:02.389 }, 00:24:02.389 { 00:24:02.389 "name": "BaseBdev3", 00:24:02.389 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:02.389 "is_configured": true, 00:24:02.389 "data_offset": 2048, 00:24:02.389 "data_size": 63488 00:24:02.389 }, 00:24:02.389 { 00:24:02.389 "name": "BaseBdev4", 00:24:02.389 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:02.389 "is_configured": true, 00:24:02.389 "data_offset": 2048, 00:24:02.389 "data_size": 63488 00:24:02.389 } 00:24:02.389 ] 00:24:02.389 }' 00:24:02.389 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:02.389 07:59:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:02.957 07:59:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:02.957 [2024-07-15 07:59:47.687627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:02.957 [2024-07-15 07:59:47.687747] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:02.957 [2024-07-15 07:59:47.687757] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:02.957 [2024-07-15 07:59:47.687774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:02.957 [2024-07-15 07:59:47.690836] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ba470 00:24:02.957 [2024-07-15 07:59:47.692410] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:03.216 07:59:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.157 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.417 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.417 "name": "raid_bdev1", 00:24:04.417 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:04.417 "strip_size_kb": 0, 00:24:04.417 "state": "online", 00:24:04.417 "raid_level": "raid1", 00:24:04.417 "superblock": true, 00:24:04.417 "num_base_bdevs": 4, 00:24:04.417 "num_base_bdevs_discovered": 3, 00:24:04.417 "num_base_bdevs_operational": 3, 00:24:04.417 "process": { 00:24:04.417 "type": "rebuild", 00:24:04.417 "target": "spare", 00:24:04.417 "progress": { 00:24:04.417 "blocks": 24576, 00:24:04.417 "percent": 38 00:24:04.417 } 00:24:04.417 }, 00:24:04.417 "base_bdevs_list": [ 00:24:04.417 { 00:24:04.417 "name": "spare", 00:24:04.417 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:24:04.417 "is_configured": true, 00:24:04.417 "data_offset": 2048, 00:24:04.417 "data_size": 63488 00:24:04.417 }, 00:24:04.417 { 00:24:04.417 "name": null, 00:24:04.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.417 "is_configured": false, 00:24:04.417 "data_offset": 2048, 00:24:04.417 "data_size": 63488 00:24:04.417 }, 00:24:04.417 { 00:24:04.417 "name": "BaseBdev3", 00:24:04.417 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:04.417 "is_configured": true, 00:24:04.417 "data_offset": 2048, 00:24:04.417 "data_size": 63488 00:24:04.417 }, 00:24:04.417 { 00:24:04.417 "name": "BaseBdev4", 00:24:04.417 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:04.417 "is_configured": true, 00:24:04.417 "data_offset": 2048, 00:24:04.417 "data_size": 63488 00:24:04.417 } 00:24:04.417 ] 00:24:04.417 }' 00:24:04.417 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.417 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:04.417 07:59:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.417 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:04.417 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:04.677 [2024-07-15 07:59:49.193351] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:04.677 [2024-07-15 07:59:49.201312] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:04.677 [2024-07-15 07:59:49.201345] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:04.677 [2024-07-15 07:59:49.201355] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:04.677 [2024-07-15 07:59:49.201359] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.677 "name": "raid_bdev1", 00:24:04.677 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:04.677 "strip_size_kb": 0, 00:24:04.677 "state": "online", 00:24:04.677 "raid_level": "raid1", 00:24:04.677 "superblock": true, 00:24:04.677 "num_base_bdevs": 4, 00:24:04.677 "num_base_bdevs_discovered": 2, 00:24:04.677 "num_base_bdevs_operational": 2, 00:24:04.677 "base_bdevs_list": [ 00:24:04.677 { 00:24:04.677 "name": null, 00:24:04.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.677 "is_configured": false, 00:24:04.677 "data_offset": 2048, 00:24:04.677 "data_size": 63488 00:24:04.677 }, 00:24:04.677 { 00:24:04.677 "name": null, 00:24:04.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.677 "is_configured": false, 00:24:04.677 "data_offset": 2048, 00:24:04.677 "data_size": 63488 00:24:04.677 }, 00:24:04.677 { 00:24:04.677 "name": "BaseBdev3", 00:24:04.677 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:04.677 "is_configured": true, 00:24:04.677 "data_offset": 2048, 00:24:04.677 "data_size": 63488 00:24:04.677 }, 00:24:04.677 { 00:24:04.677 "name": "BaseBdev4", 00:24:04.677 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:04.677 "is_configured": true, 
00:24:04.677 "data_offset": 2048, 00:24:04.677 "data_size": 63488 00:24:04.677 } 00:24:04.677 ] 00:24:04.677 }' 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.677 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:05.247 07:59:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:05.507 [2024-07-15 07:59:50.083689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:05.507 [2024-07-15 07:59:50.083734] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:05.507 [2024-07-15 07:59:50.083748] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16ba270 00:24:05.507 [2024-07-15 07:59:50.083755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:05.507 [2024-07-15 07:59:50.084072] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:05.508 [2024-07-15 07:59:50.084084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:05.508 [2024-07-15 07:59:50.084144] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:05.508 [2024-07-15 07:59:50.084151] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:05.508 [2024-07-15 07:59:50.084157] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:05.508 [2024-07-15 07:59:50.084168] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:05.508 [2024-07-15 07:59:50.087082] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a5b50 00:24:05.508 [2024-07-15 07:59:50.088231] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:05.508 spare 00:24:05.508 07:59:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.450 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.712 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.712 "name": "raid_bdev1", 00:24:06.712 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:06.712 "strip_size_kb": 0, 00:24:06.712 "state": "online", 00:24:06.712 "raid_level": "raid1", 00:24:06.712 "superblock": true, 00:24:06.712 "num_base_bdevs": 4, 00:24:06.712 "num_base_bdevs_discovered": 3, 00:24:06.712 
"num_base_bdevs_operational": 3, 00:24:06.712 "process": { 00:24:06.712 "type": "rebuild", 00:24:06.712 "target": "spare", 00:24:06.712 "progress": { 00:24:06.712 "blocks": 22528, 00:24:06.712 "percent": 35 00:24:06.712 } 00:24:06.712 }, 00:24:06.712 "base_bdevs_list": [ 00:24:06.712 { 00:24:06.712 "name": "spare", 00:24:06.712 "uuid": "b7e7c789-b9cc-5fe5-8f69-642e481c4227", 00:24:06.712 "is_configured": true, 00:24:06.712 "data_offset": 2048, 00:24:06.712 "data_size": 63488 00:24:06.712 }, 00:24:06.712 { 00:24:06.712 "name": null, 00:24:06.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.712 "is_configured": false, 00:24:06.712 "data_offset": 2048, 00:24:06.712 "data_size": 63488 00:24:06.712 }, 00:24:06.712 { 00:24:06.712 "name": "BaseBdev3", 00:24:06.712 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:06.712 "is_configured": true, 00:24:06.712 "data_offset": 2048, 00:24:06.712 "data_size": 63488 00:24:06.712 }, 00:24:06.712 { 00:24:06.712 "name": "BaseBdev4", 00:24:06.712 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:06.712 "is_configured": true, 00:24:06.712 "data_offset": 2048, 00:24:06.712 "data_size": 63488 00:24:06.712 } 00:24:06.712 ] 00:24:06.712 }' 00:24:06.712 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.712 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:06.712 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.712 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:06.712 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:06.973 [2024-07-15 07:59:51.577149] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.973 [2024-07-15 07:59:51.597149] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:06.973 [2024-07-15 07:59:51.597181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.973 [2024-07-15 07:59:51.597192] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.973 [2024-07-15 07:59:51.597196] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.973 
07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.973 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.233 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.233 "name": "raid_bdev1", 00:24:07.233 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:07.233 "strip_size_kb": 0, 00:24:07.233 "state": "online", 00:24:07.233 "raid_level": "raid1", 00:24:07.233 "superblock": true, 00:24:07.233 "num_base_bdevs": 4, 00:24:07.233 "num_base_bdevs_discovered": 2, 00:24:07.234 "num_base_bdevs_operational": 2, 00:24:07.234 "base_bdevs_list": [ 00:24:07.234 { 00:24:07.234 "name": null, 00:24:07.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.234 "is_configured": false, 00:24:07.234 "data_offset": 2048, 00:24:07.234 "data_size": 63488 00:24:07.234 }, 00:24:07.234 { 00:24:07.234 "name": null, 00:24:07.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.234 "is_configured": false, 00:24:07.234 "data_offset": 2048, 00:24:07.234 "data_size": 63488 00:24:07.234 }, 00:24:07.234 { 00:24:07.234 "name": "BaseBdev3", 00:24:07.234 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:07.234 "is_configured": true, 00:24:07.234 "data_offset": 2048, 00:24:07.234 "data_size": 63488 00:24:07.234 }, 00:24:07.234 { 00:24:07.234 "name": "BaseBdev4", 00:24:07.234 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:07.234 "is_configured": true, 00:24:07.234 "data_offset": 2048, 00:24:07.234 "data_size": 63488 00:24:07.234 } 00:24:07.234 ] 00:24:07.234 }' 00:24:07.234 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.234 07:59:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:07.804 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:07.804 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.804 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:07.804 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:07.804 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.805 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.805 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.805 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.805 "name": "raid_bdev1", 00:24:07.805 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:07.805 "strip_size_kb": 0, 00:24:07.805 "state": "online", 00:24:07.805 "raid_level": "raid1", 00:24:07.805 "superblock": true, 00:24:07.805 "num_base_bdevs": 4, 00:24:07.805 "num_base_bdevs_discovered": 2, 00:24:07.805 "num_base_bdevs_operational": 2, 00:24:07.805 "base_bdevs_list": [ 00:24:07.805 { 00:24:07.805 "name": null, 00:24:07.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.805 
"is_configured": false, 00:24:07.805 "data_offset": 2048, 00:24:07.805 "data_size": 63488 00:24:07.805 }, 00:24:07.805 { 00:24:07.805 "name": null, 00:24:07.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.805 "is_configured": false, 00:24:07.805 "data_offset": 2048, 00:24:07.805 "data_size": 63488 00:24:07.805 }, 00:24:07.805 { 00:24:07.805 "name": "BaseBdev3", 00:24:07.805 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:07.805 "is_configured": true, 00:24:07.805 "data_offset": 2048, 00:24:07.805 "data_size": 63488 00:24:07.805 }, 00:24:07.805 { 00:24:07.805 "name": "BaseBdev4", 00:24:07.805 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:07.805 "is_configured": true, 00:24:07.805 "data_offset": 2048, 00:24:07.805 "data_size": 63488 00:24:07.805 } 00:24:07.805 ] 00:24:07.805 }' 00:24:07.805 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.066 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:08.066 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.066 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:08.066 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:08.066 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:08.327 [2024-07-15 07:59:52.968823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:08.327 [2024-07-15 07:59:52.968856] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:08.327 [2024-07-15 07:59:52.968868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a4f60 00:24:08.327 [2024-07-15 07:59:52.968874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:08.327 [2024-07-15 07:59:52.969158] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:08.327 [2024-07-15 07:59:52.969170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:08.327 [2024-07-15 07:59:52.969215] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:08.327 [2024-07-15 07:59:52.969222] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:08.327 [2024-07-15 07:59:52.969229] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:08.327 BaseBdev1 00:24:08.327 07:59:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.268 07:59:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.528 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.528 "name": "raid_bdev1", 00:24:09.528 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:09.528 "strip_size_kb": 0, 00:24:09.528 "state": "online", 00:24:09.528 "raid_level": "raid1", 00:24:09.528 "superblock": true, 00:24:09.528 "num_base_bdevs": 4, 00:24:09.528 "num_base_bdevs_discovered": 2, 00:24:09.528 "num_base_bdevs_operational": 2, 00:24:09.528 "base_bdevs_list": [ 00:24:09.528 { 00:24:09.528 "name": null, 00:24:09.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.528 "is_configured": false, 00:24:09.528 "data_offset": 2048, 00:24:09.528 "data_size": 63488 00:24:09.528 }, 00:24:09.528 { 00:24:09.528 "name": null, 00:24:09.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.528 "is_configured": false, 00:24:09.528 "data_offset": 2048, 00:24:09.528 "data_size": 63488 00:24:09.528 }, 00:24:09.528 { 00:24:09.528 "name": "BaseBdev3", 00:24:09.528 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:09.528 "is_configured": true, 00:24:09.528 "data_offset": 2048, 00:24:09.528 "data_size": 63488 00:24:09.528 }, 00:24:09.528 { 00:24:09.528 "name": "BaseBdev4", 00:24:09.528 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:09.528 "is_configured": true, 00:24:09.528 "data_offset": 2048, 00:24:09.528 "data_size": 63488 00:24:09.528 } 00:24:09.528 ] 00:24:09.528 }' 00:24:09.528 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.528 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.098 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.357 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:24:10.357 "name": "raid_bdev1", 00:24:10.357 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:10.357 "strip_size_kb": 0, 00:24:10.357 "state": "online", 00:24:10.357 "raid_level": "raid1", 00:24:10.357 "superblock": true, 00:24:10.357 "num_base_bdevs": 4, 00:24:10.357 "num_base_bdevs_discovered": 2, 00:24:10.357 "num_base_bdevs_operational": 2, 00:24:10.357 "base_bdevs_list": [ 00:24:10.358 { 00:24:10.358 "name": null, 00:24:10.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.358 "is_configured": false, 00:24:10.358 "data_offset": 2048, 00:24:10.358 "data_size": 63488 00:24:10.358 }, 00:24:10.358 { 00:24:10.358 "name": null, 00:24:10.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.358 "is_configured": false, 00:24:10.358 "data_offset": 2048, 00:24:10.358 "data_size": 63488 00:24:10.358 }, 00:24:10.358 { 00:24:10.358 "name": "BaseBdev3", 00:24:10.358 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:10.358 "is_configured": true, 00:24:10.358 "data_offset": 2048, 00:24:10.358 "data_size": 63488 00:24:10.358 }, 00:24:10.358 { 00:24:10.358 "name": "BaseBdev4", 00:24:10.358 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:10.358 "is_configured": true, 00:24:10.358 "data_offset": 2048, 00:24:10.358 "data_size": 63488 00:24:10.358 } 00:24:10.358 ] 00:24:10.358 }' 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:10.358 07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:10.358 
07:59:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:10.618 [2024-07-15 07:59:55.158637] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:10.618 [2024-07-15 07:59:55.158738] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:10.618 [2024-07-15 07:59:55.158748] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:10.618 request: 00:24:10.618 { 00:24:10.618 "base_bdev": "BaseBdev1", 00:24:10.618 "raid_bdev": "raid_bdev1", 00:24:10.618 "method": "bdev_raid_add_base_bdev", 00:24:10.618 "req_id": 1 00:24:10.618 } 00:24:10.618 Got JSON-RPC error response 00:24:10.618 response: 00:24:10.618 { 00:24:10.618 "code": -22, 00:24:10.618 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:10.618 } 00:24:10.618 07:59:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:10.618 07:59:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:10.618 07:59:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:10.618 07:59:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:10.618 07:59:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.558 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.819 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:11.819 "name": "raid_bdev1", 00:24:11.819 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:11.819 "strip_size_kb": 0, 00:24:11.819 "state": "online", 00:24:11.819 "raid_level": "raid1", 00:24:11.819 "superblock": true, 00:24:11.819 "num_base_bdevs": 4, 00:24:11.819 "num_base_bdevs_discovered": 2, 00:24:11.819 "num_base_bdevs_operational": 2, 00:24:11.819 "base_bdevs_list": [ 
00:24:11.819 { 00:24:11.819 "name": null, 00:24:11.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.819 "is_configured": false, 00:24:11.819 "data_offset": 2048, 00:24:11.819 "data_size": 63488 00:24:11.819 }, 00:24:11.819 { 00:24:11.819 "name": null, 00:24:11.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.819 "is_configured": false, 00:24:11.819 "data_offset": 2048, 00:24:11.819 "data_size": 63488 00:24:11.819 }, 00:24:11.819 { 00:24:11.819 "name": "BaseBdev3", 00:24:11.819 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:11.819 "is_configured": true, 00:24:11.819 "data_offset": 2048, 00:24:11.819 "data_size": 63488 00:24:11.819 }, 00:24:11.819 { 00:24:11.819 "name": "BaseBdev4", 00:24:11.819 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:11.819 "is_configured": true, 00:24:11.819 "data_offset": 2048, 00:24:11.819 "data_size": 63488 00:24:11.819 } 00:24:11.819 ] 00:24:11.819 }' 00:24:11.819 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:11.819 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.391 07:59:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.391 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.391 "name": "raid_bdev1", 00:24:12.391 "uuid": "fcf9c29c-b8f9-499f-a161-f692acf38335", 00:24:12.391 "strip_size_kb": 0, 00:24:12.391 "state": "online", 00:24:12.391 "raid_level": "raid1", 00:24:12.391 "superblock": true, 00:24:12.391 "num_base_bdevs": 4, 00:24:12.391 "num_base_bdevs_discovered": 2, 00:24:12.391 "num_base_bdevs_operational": 2, 00:24:12.391 "base_bdevs_list": [ 00:24:12.391 { 00:24:12.391 "name": null, 00:24:12.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.391 "is_configured": false, 00:24:12.391 "data_offset": 2048, 00:24:12.391 "data_size": 63488 00:24:12.391 }, 00:24:12.391 { 00:24:12.391 "name": null, 00:24:12.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.391 "is_configured": false, 00:24:12.391 "data_offset": 2048, 00:24:12.391 "data_size": 63488 00:24:12.391 }, 00:24:12.391 { 00:24:12.391 "name": "BaseBdev3", 00:24:12.391 "uuid": "74f741a5-f937-5899-ae24-dbb9c4303a51", 00:24:12.391 "is_configured": true, 00:24:12.391 "data_offset": 2048, 00:24:12.391 "data_size": 63488 00:24:12.391 }, 00:24:12.391 { 00:24:12.391 "name": "BaseBdev4", 00:24:12.391 "uuid": "8ae172b2-469c-5fc0-b882-77fe3565d350", 00:24:12.391 "is_configured": true, 00:24:12.391 "data_offset": 2048, 00:24:12.391 "data_size": 63488 00:24:12.391 } 00:24:12.391 ] 00:24:12.391 }' 00:24:12.391 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:24:12.391 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:12.391 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1736050 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1736050 ']' 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1736050 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1736050 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1736050' 00:24:12.652 killing process with pid 1736050 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1736050 00:24:12.652 Received shutdown signal, test time was about 24.754663 seconds 00:24:12.652 00:24:12.652 Latency(us) 00:24:12.652 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:12.652 =================================================================================================================== 00:24:12.652 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:12.652 [2024-07-15 07:59:57.244197] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:12.652 [2024-07-15 07:59:57.244274] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:12.652 [2024-07-15 07:59:57.244316] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:12.652 [2024-07-15 07:59:57.244323] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16a1570 name raid_bdev1, state offline 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1736050 00:24:12.652 [2024-07-15 07:59:57.267424] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:12.652 00:24:12.652 real 0m29.313s 00:24:12.652 user 0m46.646s 00:24:12.652 sys 0m3.494s 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:12.652 07:59:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:12.652 ************************************ 00:24:12.652 END TEST raid_rebuild_test_sb_io 00:24:12.652 ************************************ 00:24:12.912 07:59:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:12.912 07:59:57 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:24:12.912 07:59:57 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:24:12.913 07:59:57 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test 
raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:24:12.913 07:59:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:24:12.913 07:59:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:12.913 07:59:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:12.913 ************************************ 00:24:12.913 START TEST raid_state_function_test_sb_4k 00:24:12.913 ************************************ 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1741417 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1741417' 00:24:12.913 Process raid pid: 1741417 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1741417 /var/tmp/spdk-raid.sock 00:24:12.913 
07:59:57 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1741417 ']' 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:12.913 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:12.913 07:59:57 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:12.913 [2024-07-15 07:59:57.534135] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:24:12.913 [2024-07-15 07:59:57.534181] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:12.913 [2024-07-15 07:59:57.621603] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:13.174 [2024-07-15 07:59:57.686631] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:13.174 [2024-07-15 07:59:57.733158] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:13.174 [2024-07-15 07:59:57.733182] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:13.744 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:13.744 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:24:13.744 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:14.004 [2024-07-15 07:59:58.544715] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:14.004 [2024-07-15 07:59:58.544744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:14.004 [2024-07-15 07:59:58.544750] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:14.004 [2024-07-15 07:59:58.544760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:14.004 "name": "Existed_Raid", 00:24:14.004 "uuid": "fe8be92b-1df0-4eef-85c9-be342d275a07", 00:24:14.004 "strip_size_kb": 0, 00:24:14.004 "state": "configuring", 00:24:14.004 "raid_level": "raid1", 00:24:14.004 "superblock": true, 00:24:14.004 "num_base_bdevs": 2, 00:24:14.004 "num_base_bdevs_discovered": 0, 00:24:14.004 "num_base_bdevs_operational": 2, 00:24:14.004 "base_bdevs_list": [ 00:24:14.004 { 00:24:14.004 "name": "BaseBdev1", 00:24:14.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:14.004 "is_configured": false, 00:24:14.004 "data_offset": 0, 00:24:14.004 "data_size": 0 00:24:14.004 }, 00:24:14.004 { 00:24:14.004 "name": "BaseBdev2", 00:24:14.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:14.004 "is_configured": false, 00:24:14.004 "data_offset": 0, 00:24:14.004 "data_size": 0 00:24:14.004 } 00:24:14.004 ] 00:24:14.004 }' 00:24:14.004 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:14.005 07:59:58 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:14.577 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:14.868 [2024-07-15 07:59:59.482983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:14.868 [2024-07-15 07:59:59.482998] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd5d6b0 name Existed_Raid, state configuring 00:24:14.868 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:15.128 [2024-07-15 07:59:59.671476] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:15.128 [2024-07-15 07:59:59.671493] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:15.128 [2024-07-15 07:59:59.671499] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:15.128 [2024-07-15 07:59:59.671504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:24:15.128 [2024-07-15 07:59:59.854605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:15.128 BaseBdev1 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:15.128 07:59:59 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:15.389 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:15.648 [ 00:24:15.648 { 00:24:15.648 "name": "BaseBdev1", 00:24:15.648 "aliases": [ 00:24:15.648 "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8" 00:24:15.648 ], 00:24:15.648 "product_name": "Malloc disk", 00:24:15.648 "block_size": 4096, 00:24:15.648 "num_blocks": 8192, 00:24:15.648 "uuid": "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8", 00:24:15.648 "assigned_rate_limits": { 00:24:15.648 "rw_ios_per_sec": 0, 00:24:15.648 "rw_mbytes_per_sec": 0, 00:24:15.648 "r_mbytes_per_sec": 0, 00:24:15.648 "w_mbytes_per_sec": 0 00:24:15.648 }, 00:24:15.648 "claimed": true, 00:24:15.648 "claim_type": "exclusive_write", 00:24:15.648 "zoned": false, 00:24:15.648 "supported_io_types": { 00:24:15.648 "read": true, 00:24:15.648 "write": true, 00:24:15.648 "unmap": true, 00:24:15.648 "flush": true, 00:24:15.648 "reset": true, 00:24:15.648 "nvme_admin": false, 00:24:15.648 "nvme_io": false, 00:24:15.648 "nvme_io_md": false, 00:24:15.648 "write_zeroes": true, 00:24:15.648 "zcopy": true, 00:24:15.648 "get_zone_info": false, 00:24:15.648 "zone_management": false, 00:24:15.648 "zone_append": false, 00:24:15.648 "compare": false, 00:24:15.648 "compare_and_write": false, 00:24:15.648 "abort": true, 00:24:15.648 "seek_hole": false, 00:24:15.648 "seek_data": false, 00:24:15.648 "copy": true, 00:24:15.648 "nvme_iov_md": false 00:24:15.648 }, 00:24:15.648 "memory_domains": [ 00:24:15.648 { 00:24:15.648 "dma_device_id": "system", 00:24:15.648 "dma_device_type": 1 00:24:15.648 }, 00:24:15.648 { 00:24:15.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:15.648 "dma_device_type": 2 00:24:15.648 } 00:24:15.648 ], 00:24:15.648 "driver_specific": {} 00:24:15.648 } 00:24:15.648 ] 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.648 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:15.908 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.908 "name": "Existed_Raid", 00:24:15.908 "uuid": "f383f32e-01e2-48d2-b662-7d7e61229f02", 00:24:15.908 "strip_size_kb": 0, 00:24:15.908 "state": "configuring", 00:24:15.908 "raid_level": "raid1", 00:24:15.908 "superblock": true, 00:24:15.908 "num_base_bdevs": 2, 00:24:15.908 "num_base_bdevs_discovered": 1, 00:24:15.908 "num_base_bdevs_operational": 2, 00:24:15.908 "base_bdevs_list": [ 00:24:15.908 { 00:24:15.908 "name": "BaseBdev1", 00:24:15.908 "uuid": "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8", 00:24:15.908 "is_configured": true, 00:24:15.908 "data_offset": 256, 00:24:15.908 "data_size": 7936 00:24:15.908 }, 00:24:15.908 { 00:24:15.908 "name": "BaseBdev2", 00:24:15.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.908 "is_configured": false, 00:24:15.908 "data_offset": 0, 00:24:15.908 "data_size": 0 00:24:15.908 } 00:24:15.908 ] 00:24:15.908 }' 00:24:15.908 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.908 08:00:00 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:16.476 08:00:01 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:17.046 [2024-07-15 08:00:01.575266] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:17.046 [2024-07-15 08:00:01.575295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd5cfa0 name Existed_Raid, state configuring 00:24:17.046 08:00:01 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:17.612 [2024-07-15 08:00:02.116648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:17.612 [2024-07-15 08:00:02.117789] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:17.612 [2024-07-15 08:00:02.117813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:17.612 
08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:17.612 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.612 "name": "Existed_Raid", 00:24:17.612 "uuid": "6212c24b-21c8-4912-9c3d-a2d3957c6a6d", 00:24:17.612 "strip_size_kb": 0, 00:24:17.612 "state": "configuring", 00:24:17.612 "raid_level": "raid1", 00:24:17.612 "superblock": true, 00:24:17.612 "num_base_bdevs": 2, 00:24:17.612 "num_base_bdevs_discovered": 1, 00:24:17.612 "num_base_bdevs_operational": 2, 00:24:17.612 "base_bdevs_list": [ 00:24:17.612 { 00:24:17.612 "name": "BaseBdev1", 00:24:17.612 "uuid": "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8", 00:24:17.612 "is_configured": true, 00:24:17.612 "data_offset": 256, 00:24:17.612 "data_size": 7936 00:24:17.612 }, 00:24:17.612 { 00:24:17.612 "name": "BaseBdev2", 00:24:17.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.612 "is_configured": false, 00:24:17.612 "data_offset": 0, 00:24:17.612 "data_size": 0 00:24:17.613 } 00:24:17.613 ] 00:24:17.613 }' 00:24:17.613 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.613 08:00:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:18.549 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:24:18.809 [2024-07-15 08:00:03.388874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:18.809 [2024-07-15 08:00:03.388979] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd5dd90 00:24:18.809 [2024-07-15 08:00:03.388987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, 
blocklen 4096 00:24:18.809 [2024-07-15 08:00:03.389124] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf118d0 00:24:18.809 [2024-07-15 08:00:03.389215] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd5dd90 00:24:18.809 [2024-07-15 08:00:03.389221] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd5dd90 00:24:18.809 [2024-07-15 08:00:03.389288] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:18.809 BaseBdev2 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:24:18.809 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:19.377 08:00:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:19.377 [ 00:24:19.377 { 00:24:19.377 "name": "BaseBdev2", 00:24:19.377 "aliases": [ 00:24:19.377 "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8" 00:24:19.377 ], 00:24:19.377 "product_name": "Malloc disk", 00:24:19.377 "block_size": 4096, 00:24:19.377 "num_blocks": 8192, 00:24:19.377 "uuid": "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8", 00:24:19.377 "assigned_rate_limits": { 00:24:19.377 "rw_ios_per_sec": 0, 00:24:19.377 "rw_mbytes_per_sec": 0, 00:24:19.377 "r_mbytes_per_sec": 0, 00:24:19.377 "w_mbytes_per_sec": 0 00:24:19.377 }, 00:24:19.377 "claimed": true, 00:24:19.377 "claim_type": "exclusive_write", 00:24:19.377 "zoned": false, 00:24:19.377 "supported_io_types": { 00:24:19.377 "read": true, 00:24:19.377 "write": true, 00:24:19.377 "unmap": true, 00:24:19.377 "flush": true, 00:24:19.377 "reset": true, 00:24:19.377 "nvme_admin": false, 00:24:19.377 "nvme_io": false, 00:24:19.377 "nvme_io_md": false, 00:24:19.377 "write_zeroes": true, 00:24:19.377 "zcopy": true, 00:24:19.377 "get_zone_info": false, 00:24:19.377 "zone_management": false, 00:24:19.377 "zone_append": false, 00:24:19.377 "compare": false, 00:24:19.377 "compare_and_write": false, 00:24:19.377 "abort": true, 00:24:19.377 "seek_hole": false, 00:24:19.377 "seek_data": false, 00:24:19.377 "copy": true, 00:24:19.377 "nvme_iov_md": false 00:24:19.377 }, 00:24:19.377 "memory_domains": [ 00:24:19.377 { 00:24:19.377 "dma_device_id": "system", 00:24:19.377 "dma_device_type": 1 00:24:19.377 }, 00:24:19.377 { 00:24:19.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:19.377 "dma_device_type": 2 00:24:19.377 } 00:24:19.377 ], 00:24:19.377 "driver_specific": {} 00:24:19.377 } 00:24:19.377 ] 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # 
(( i++ )) 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.636 "name": "Existed_Raid", 00:24:19.636 "uuid": "6212c24b-21c8-4912-9c3d-a2d3957c6a6d", 00:24:19.636 "strip_size_kb": 0, 00:24:19.636 "state": "online", 00:24:19.636 "raid_level": "raid1", 00:24:19.636 "superblock": true, 00:24:19.636 "num_base_bdevs": 2, 00:24:19.636 "num_base_bdevs_discovered": 2, 00:24:19.636 "num_base_bdevs_operational": 2, 00:24:19.636 "base_bdevs_list": [ 00:24:19.636 { 00:24:19.636 "name": "BaseBdev1", 00:24:19.636 "uuid": "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8", 00:24:19.636 "is_configured": true, 00:24:19.636 "data_offset": 256, 00:24:19.636 "data_size": 7936 00:24:19.636 }, 00:24:19.636 { 00:24:19.636 "name": "BaseBdev2", 00:24:19.636 "uuid": "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8", 00:24:19.636 "is_configured": true, 00:24:19.636 "data_offset": 256, 00:24:19.636 "data_size": 7936 00:24:19.636 } 00:24:19.636 ] 00:24:19.636 }' 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.636 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@198 -- # local name 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:20.204 08:00:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:20.464 [2024-07-15 08:00:05.013184] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:20.464 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:20.464 "name": "Existed_Raid", 00:24:20.464 "aliases": [ 00:24:20.464 "6212c24b-21c8-4912-9c3d-a2d3957c6a6d" 00:24:20.464 ], 00:24:20.464 "product_name": "Raid Volume", 00:24:20.464 "block_size": 4096, 00:24:20.464 "num_blocks": 7936, 00:24:20.464 "uuid": "6212c24b-21c8-4912-9c3d-a2d3957c6a6d", 00:24:20.464 "assigned_rate_limits": { 00:24:20.464 "rw_ios_per_sec": 0, 00:24:20.464 "rw_mbytes_per_sec": 0, 00:24:20.464 "r_mbytes_per_sec": 0, 00:24:20.464 "w_mbytes_per_sec": 0 00:24:20.464 }, 00:24:20.464 "claimed": false, 00:24:20.464 "zoned": false, 00:24:20.464 "supported_io_types": { 00:24:20.464 "read": true, 00:24:20.464 "write": true, 00:24:20.464 "unmap": false, 00:24:20.464 "flush": false, 00:24:20.464 "reset": true, 00:24:20.464 "nvme_admin": false, 00:24:20.464 "nvme_io": false, 00:24:20.464 "nvme_io_md": false, 00:24:20.464 "write_zeroes": true, 00:24:20.464 "zcopy": false, 00:24:20.464 "get_zone_info": false, 00:24:20.464 "zone_management": false, 00:24:20.464 "zone_append": false, 00:24:20.464 "compare": false, 00:24:20.464 "compare_and_write": false, 00:24:20.464 "abort": false, 00:24:20.464 "seek_hole": false, 00:24:20.464 "seek_data": false, 00:24:20.464 "copy": false, 00:24:20.464 "nvme_iov_md": false 00:24:20.464 }, 00:24:20.464 "memory_domains": [ 00:24:20.464 { 00:24:20.464 "dma_device_id": "system", 00:24:20.464 "dma_device_type": 1 00:24:20.464 }, 00:24:20.464 { 00:24:20.464 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.464 "dma_device_type": 2 00:24:20.464 }, 00:24:20.464 { 00:24:20.464 "dma_device_id": "system", 00:24:20.464 "dma_device_type": 1 00:24:20.464 }, 00:24:20.464 { 00:24:20.464 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.464 "dma_device_type": 2 00:24:20.464 } 00:24:20.464 ], 00:24:20.464 "driver_specific": { 00:24:20.464 "raid": { 00:24:20.464 "uuid": "6212c24b-21c8-4912-9c3d-a2d3957c6a6d", 00:24:20.464 "strip_size_kb": 0, 00:24:20.464 "state": "online", 00:24:20.464 "raid_level": "raid1", 00:24:20.464 "superblock": true, 00:24:20.464 "num_base_bdevs": 2, 00:24:20.464 "num_base_bdevs_discovered": 2, 00:24:20.464 "num_base_bdevs_operational": 2, 00:24:20.464 "base_bdevs_list": [ 00:24:20.464 { 00:24:20.464 "name": "BaseBdev1", 00:24:20.464 "uuid": "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8", 00:24:20.464 "is_configured": true, 00:24:20.464 "data_offset": 256, 00:24:20.464 "data_size": 7936 00:24:20.464 }, 00:24:20.464 { 00:24:20.464 "name": "BaseBdev2", 00:24:20.464 "uuid": "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8", 00:24:20.464 "is_configured": true, 00:24:20.464 "data_offset": 256, 00:24:20.464 "data_size": 7936 00:24:20.464 } 00:24:20.464 ] 00:24:20.464 } 00:24:20.464 } 00:24:20.464 }' 00:24:20.464 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:20.464 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='BaseBdev1 00:24:20.464 BaseBdev2' 00:24:20.464 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:20.464 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:20.464 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:20.724 "name": "BaseBdev1", 00:24:20.724 "aliases": [ 00:24:20.724 "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8" 00:24:20.724 ], 00:24:20.724 "product_name": "Malloc disk", 00:24:20.724 "block_size": 4096, 00:24:20.724 "num_blocks": 8192, 00:24:20.724 "uuid": "15ad3de8-7308-4bf5-a17e-da2a12ec2ca8", 00:24:20.724 "assigned_rate_limits": { 00:24:20.724 "rw_ios_per_sec": 0, 00:24:20.724 "rw_mbytes_per_sec": 0, 00:24:20.724 "r_mbytes_per_sec": 0, 00:24:20.724 "w_mbytes_per_sec": 0 00:24:20.724 }, 00:24:20.724 "claimed": true, 00:24:20.724 "claim_type": "exclusive_write", 00:24:20.724 "zoned": false, 00:24:20.724 "supported_io_types": { 00:24:20.724 "read": true, 00:24:20.724 "write": true, 00:24:20.724 "unmap": true, 00:24:20.724 "flush": true, 00:24:20.724 "reset": true, 00:24:20.724 "nvme_admin": false, 00:24:20.724 "nvme_io": false, 00:24:20.724 "nvme_io_md": false, 00:24:20.724 "write_zeroes": true, 00:24:20.724 "zcopy": true, 00:24:20.724 "get_zone_info": false, 00:24:20.724 "zone_management": false, 00:24:20.724 "zone_append": false, 00:24:20.724 "compare": false, 00:24:20.724 "compare_and_write": false, 00:24:20.724 "abort": true, 00:24:20.724 "seek_hole": false, 00:24:20.724 "seek_data": false, 00:24:20.724 "copy": true, 00:24:20.724 "nvme_iov_md": false 00:24:20.724 }, 00:24:20.724 "memory_domains": [ 00:24:20.724 { 00:24:20.724 "dma_device_id": "system", 00:24:20.724 "dma_device_type": 1 00:24:20.724 }, 00:24:20.724 { 00:24:20.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:20.724 "dma_device_type": 2 00:24:20.724 } 00:24:20.724 ], 00:24:20.724 "driver_specific": {} 00:24:20.724 }' 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:20.724 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:20.984 08:00:05 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:20.984 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:21.244 "name": "BaseBdev2", 00:24:21.244 "aliases": [ 00:24:21.244 "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8" 00:24:21.244 ], 00:24:21.244 "product_name": "Malloc disk", 00:24:21.244 "block_size": 4096, 00:24:21.244 "num_blocks": 8192, 00:24:21.244 "uuid": "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8", 00:24:21.244 "assigned_rate_limits": { 00:24:21.244 "rw_ios_per_sec": 0, 00:24:21.244 "rw_mbytes_per_sec": 0, 00:24:21.244 "r_mbytes_per_sec": 0, 00:24:21.244 "w_mbytes_per_sec": 0 00:24:21.244 }, 00:24:21.244 "claimed": true, 00:24:21.244 "claim_type": "exclusive_write", 00:24:21.244 "zoned": false, 00:24:21.244 "supported_io_types": { 00:24:21.244 "read": true, 00:24:21.244 "write": true, 00:24:21.244 "unmap": true, 00:24:21.244 "flush": true, 00:24:21.244 "reset": true, 00:24:21.244 "nvme_admin": false, 00:24:21.244 "nvme_io": false, 00:24:21.244 "nvme_io_md": false, 00:24:21.244 "write_zeroes": true, 00:24:21.244 "zcopy": true, 00:24:21.244 "get_zone_info": false, 00:24:21.244 "zone_management": false, 00:24:21.244 "zone_append": false, 00:24:21.244 "compare": false, 00:24:21.244 "compare_and_write": false, 00:24:21.244 "abort": true, 00:24:21.244 "seek_hole": false, 00:24:21.244 "seek_data": false, 00:24:21.244 "copy": true, 00:24:21.244 "nvme_iov_md": false 00:24:21.244 }, 00:24:21.244 "memory_domains": [ 00:24:21.244 { 00:24:21.244 "dma_device_id": "system", 00:24:21.244 "dma_device_type": 1 00:24:21.244 }, 00:24:21.244 { 00:24:21.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:21.244 "dma_device_type": 2 00:24:21.244 } 00:24:21.244 ], 00:24:21.244 "driver_specific": {} 00:24:21.244 }' 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:21.244 08:00:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.503 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:21.503 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:21.503 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.503 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:21.503 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:21.503 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:22.081 [2024-07-15 08:00:06.669200] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.082 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:22.340 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.340 "name": "Existed_Raid", 00:24:22.340 "uuid": "6212c24b-21c8-4912-9c3d-a2d3957c6a6d", 00:24:22.340 "strip_size_kb": 0, 00:24:22.340 "state": "online", 00:24:22.340 "raid_level": "raid1", 00:24:22.340 "superblock": true, 00:24:22.340 "num_base_bdevs": 2, 00:24:22.340 "num_base_bdevs_discovered": 1, 00:24:22.340 "num_base_bdevs_operational": 1, 00:24:22.340 "base_bdevs_list": [ 00:24:22.340 { 00:24:22.340 "name": null, 00:24:22.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.340 "is_configured": false, 00:24:22.340 "data_offset": 256, 00:24:22.340 "data_size": 7936 00:24:22.340 }, 00:24:22.340 { 00:24:22.340 "name": "BaseBdev2", 00:24:22.340 "uuid": "8042c4ff-3441-48e7-9ba7-84c7ec2e1fb8", 00:24:22.340 "is_configured": true, 00:24:22.340 "data_offset": 256, 00:24:22.340 "data_size": 7936 00:24:22.340 } 00:24:22.340 ] 00:24:22.340 }' 00:24:22.340 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.340 08:00:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:22.909 08:00:07 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:22.909 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:22.909 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:22.909 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.909 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:22.909 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:22.909 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:23.168 [2024-07-15 08:00:07.812078] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:23.168 [2024-07-15 08:00:07.812137] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:23.168 [2024-07-15 08:00:07.818166] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:23.168 [2024-07-15 08:00:07.818190] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:23.168 [2024-07-15 08:00:07.818196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd5dd90 name Existed_Raid, state offline 00:24:23.168 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:23.168 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:23.168 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.168 08:00:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1741417 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1741417 ']' 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1741417 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1741417 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1741417' 00:24:23.427 killing process with pid 1741417 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1741417 00:24:23.427 [2024-07-15 08:00:08.073198] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:23.427 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1741417 00:24:23.427 [2024-07-15 08:00:08.073787] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:23.686 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:24:23.686 00:24:23.686 real 0m10.718s 00:24:23.686 user 0m19.577s 00:24:23.686 sys 0m1.514s 00:24:23.686 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:23.686 08:00:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:23.686 ************************************ 00:24:23.686 END TEST raid_state_function_test_sb_4k 00:24:23.686 ************************************ 00:24:23.686 08:00:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:23.686 08:00:08 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:24:23.686 08:00:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:24:23.686 08:00:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:23.686 08:00:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:23.686 ************************************ 00:24:23.686 START TEST raid_superblock_test_4k 00:24:23.686 ************************************ 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@411 -- # raid_pid=1743586 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1743586 /var/tmp/spdk-raid.sock 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 1743586 ']' 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:23.686 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:23.686 08:00:08 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:23.686 [2024-07-15 08:00:08.328803] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:24:23.686 [2024-07-15 08:00:08.328854] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1743586 ] 00:24:23.686 [2024-07-15 08:00:08.417415] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:23.945 [2024-07-15 08:00:08.485402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:23.945 [2024-07-15 08:00:08.535667] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:23.945 [2024-07-15 08:00:08.535693] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:24.514 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:24:24.774 malloc1 00:24:24.774 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:25.034 [2024-07-15 08:00:09.538660] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:25.034 [2024-07-15 08:00:09.538693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:25.034 [2024-07-15 08:00:09.538704] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff1a20 00:24:25.034 [2024-07-15 08:00:09.538716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:25.034 [2024-07-15 08:00:09.540017] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:25.034 [2024-07-15 08:00:09.540036] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:25.034 pt1 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:24:25.034 malloc2 00:24:25.034 08:00:09 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:25.603 [2024-07-15 08:00:10.278611] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:25.603 [2024-07-15 08:00:10.278649] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:25.603 [2024-07-15 08:00:10.278662] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff2040 00:24:25.603 [2024-07-15 08:00:10.278668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:25.603 [2024-07-15 08:00:10.279918] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:25.603 [2024-07-15 08:00:10.279938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:25.603 pt2 00:24:25.603 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:25.603 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:25.603 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:24:25.863 [2024-07-15 08:00:10.487138] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:25.863 [2024-07-15 08:00:10.488168] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:25.863 [2024-07-15 08:00:10.488283] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x219e3d0 00:24:25.863 [2024-07-15 08:00:10.488291] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:25.863 [2024-07-15 08:00:10.488441] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2008910 00:24:25.863 [2024-07-15 08:00:10.488553] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x219e3d0 00:24:25.863 [2024-07-15 08:00:10.488559] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x219e3d0 00:24:25.864 [2024-07-15 08:00:10.488631] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.864 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.124 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.124 "name": "raid_bdev1", 00:24:26.124 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:26.124 "strip_size_kb": 0, 00:24:26.124 "state": "online", 00:24:26.124 "raid_level": "raid1", 00:24:26.124 "superblock": true, 00:24:26.124 "num_base_bdevs": 2, 00:24:26.124 "num_base_bdevs_discovered": 2, 00:24:26.124 "num_base_bdevs_operational": 2, 00:24:26.124 "base_bdevs_list": [ 00:24:26.124 { 00:24:26.124 "name": "pt1", 00:24:26.124 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:26.124 "is_configured": true, 00:24:26.124 "data_offset": 256, 00:24:26.124 "data_size": 7936 00:24:26.124 }, 00:24:26.124 { 00:24:26.124 "name": "pt2", 00:24:26.124 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:26.124 "is_configured": true, 00:24:26.124 "data_offset": 256, 00:24:26.124 "data_size": 7936 00:24:26.124 } 00:24:26.124 ] 00:24:26.124 }' 00:24:26.124 08:00:10 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.124 08:00:10 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:26.693 [2024-07-15 08:00:11.413661] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:26.693 "name": "raid_bdev1", 00:24:26.693 "aliases": [ 00:24:26.693 "a86dbe4e-1587-4899-835a-fbe44fa5a01b" 00:24:26.693 ], 00:24:26.693 "product_name": "Raid Volume", 00:24:26.693 "block_size": 4096, 00:24:26.693 "num_blocks": 7936, 00:24:26.693 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:26.693 "assigned_rate_limits": { 00:24:26.693 "rw_ios_per_sec": 0, 00:24:26.693 "rw_mbytes_per_sec": 0, 00:24:26.693 "r_mbytes_per_sec": 0, 00:24:26.693 "w_mbytes_per_sec": 0 00:24:26.693 }, 00:24:26.693 "claimed": false, 00:24:26.693 "zoned": false, 00:24:26.693 "supported_io_types": { 00:24:26.693 "read": true, 00:24:26.693 "write": true, 00:24:26.693 "unmap": false, 00:24:26.693 "flush": false, 00:24:26.693 "reset": true, 00:24:26.693 "nvme_admin": false, 00:24:26.693 "nvme_io": false, 00:24:26.693 "nvme_io_md": false, 00:24:26.693 "write_zeroes": true, 00:24:26.693 "zcopy": false, 00:24:26.693 "get_zone_info": false, 00:24:26.693 "zone_management": false, 00:24:26.693 "zone_append": false, 00:24:26.693 "compare": false, 00:24:26.693 "compare_and_write": false, 00:24:26.693 "abort": false, 00:24:26.693 "seek_hole": false, 00:24:26.693 "seek_data": false, 00:24:26.693 "copy": false, 00:24:26.693 "nvme_iov_md": false 00:24:26.693 }, 00:24:26.693 "memory_domains": [ 00:24:26.693 { 00:24:26.693 "dma_device_id": "system", 00:24:26.693 "dma_device_type": 1 00:24:26.693 }, 00:24:26.693 { 00:24:26.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:26.693 "dma_device_type": 2 00:24:26.693 }, 00:24:26.693 { 00:24:26.693 "dma_device_id": "system", 00:24:26.693 "dma_device_type": 1 00:24:26.693 }, 00:24:26.693 { 00:24:26.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:26.693 "dma_device_type": 2 00:24:26.693 } 00:24:26.693 ], 00:24:26.693 "driver_specific": { 00:24:26.693 "raid": { 00:24:26.693 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:26.693 "strip_size_kb": 0, 00:24:26.693 "state": "online", 00:24:26.693 "raid_level": "raid1", 00:24:26.693 "superblock": true, 00:24:26.693 "num_base_bdevs": 2, 00:24:26.693 "num_base_bdevs_discovered": 2, 00:24:26.693 "num_base_bdevs_operational": 2, 00:24:26.693 "base_bdevs_list": [ 00:24:26.693 { 00:24:26.693 "name": "pt1", 00:24:26.693 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:26.693 "is_configured": true, 00:24:26.693 
"data_offset": 256, 00:24:26.693 "data_size": 7936 00:24:26.693 }, 00:24:26.693 { 00:24:26.693 "name": "pt2", 00:24:26.693 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:26.693 "is_configured": true, 00:24:26.693 "data_offset": 256, 00:24:26.693 "data_size": 7936 00:24:26.693 } 00:24:26.693 ] 00:24:26.693 } 00:24:26.693 } 00:24:26.693 }' 00:24:26.693 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:26.953 pt2' 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:26.953 "name": "pt1", 00:24:26.953 "aliases": [ 00:24:26.953 "00000000-0000-0000-0000-000000000001" 00:24:26.953 ], 00:24:26.953 "product_name": "passthru", 00:24:26.953 "block_size": 4096, 00:24:26.953 "num_blocks": 8192, 00:24:26.953 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:26.953 "assigned_rate_limits": { 00:24:26.953 "rw_ios_per_sec": 0, 00:24:26.953 "rw_mbytes_per_sec": 0, 00:24:26.953 "r_mbytes_per_sec": 0, 00:24:26.953 "w_mbytes_per_sec": 0 00:24:26.953 }, 00:24:26.953 "claimed": true, 00:24:26.953 "claim_type": "exclusive_write", 00:24:26.953 "zoned": false, 00:24:26.953 "supported_io_types": { 00:24:26.953 "read": true, 00:24:26.953 "write": true, 00:24:26.953 "unmap": true, 00:24:26.953 "flush": true, 00:24:26.953 "reset": true, 00:24:26.953 "nvme_admin": false, 00:24:26.953 "nvme_io": false, 00:24:26.953 "nvme_io_md": false, 00:24:26.953 "write_zeroes": true, 00:24:26.953 "zcopy": true, 00:24:26.953 "get_zone_info": false, 00:24:26.953 "zone_management": false, 00:24:26.953 "zone_append": false, 00:24:26.953 "compare": false, 00:24:26.953 "compare_and_write": false, 00:24:26.953 "abort": true, 00:24:26.953 "seek_hole": false, 00:24:26.953 "seek_data": false, 00:24:26.953 "copy": true, 00:24:26.953 "nvme_iov_md": false 00:24:26.953 }, 00:24:26.953 "memory_domains": [ 00:24:26.953 { 00:24:26.953 "dma_device_id": "system", 00:24:26.953 "dma_device_type": 1 00:24:26.953 }, 00:24:26.953 { 00:24:26.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:26.953 "dma_device_type": 2 00:24:26.953 } 00:24:26.953 ], 00:24:26.953 "driver_specific": { 00:24:26.953 "passthru": { 00:24:26.953 "name": "pt1", 00:24:26.953 "base_bdev_name": "malloc1" 00:24:26.953 } 00:24:26.953 } 00:24:26.953 }' 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:26.953 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:27.213 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:27.473 08:00:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:27.473 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:27.473 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:27.473 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:27.473 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:27.473 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:27.473 "name": "pt2", 00:24:27.473 "aliases": [ 00:24:27.473 "00000000-0000-0000-0000-000000000002" 00:24:27.473 ], 00:24:27.473 "product_name": "passthru", 00:24:27.473 "block_size": 4096, 00:24:27.473 "num_blocks": 8192, 00:24:27.473 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:27.473 "assigned_rate_limits": { 00:24:27.473 "rw_ios_per_sec": 0, 00:24:27.473 "rw_mbytes_per_sec": 0, 00:24:27.473 "r_mbytes_per_sec": 0, 00:24:27.473 "w_mbytes_per_sec": 0 00:24:27.473 }, 00:24:27.473 "claimed": true, 00:24:27.473 "claim_type": "exclusive_write", 00:24:27.473 "zoned": false, 00:24:27.473 "supported_io_types": { 00:24:27.473 "read": true, 00:24:27.473 "write": true, 00:24:27.473 "unmap": true, 00:24:27.473 "flush": true, 00:24:27.473 "reset": true, 00:24:27.473 "nvme_admin": false, 00:24:27.473 "nvme_io": false, 00:24:27.473 "nvme_io_md": false, 00:24:27.473 "write_zeroes": true, 00:24:27.473 "zcopy": true, 00:24:27.473 "get_zone_info": false, 00:24:27.473 "zone_management": false, 00:24:27.473 "zone_append": false, 00:24:27.473 "compare": false, 00:24:27.473 "compare_and_write": false, 00:24:27.473 "abort": true, 00:24:27.473 "seek_hole": false, 00:24:27.473 "seek_data": false, 00:24:27.473 "copy": true, 00:24:27.473 "nvme_iov_md": false 00:24:27.473 }, 00:24:27.473 "memory_domains": [ 00:24:27.473 { 00:24:27.473 "dma_device_id": "system", 00:24:27.473 "dma_device_type": 1 00:24:27.473 }, 00:24:27.473 { 00:24:27.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:27.473 "dma_device_type": 2 00:24:27.473 } 00:24:27.473 ], 00:24:27.473 "driver_specific": { 00:24:27.473 "passthru": { 00:24:27.473 "name": "pt2", 00:24:27.473 "base_bdev_name": "malloc2" 00:24:27.473 } 00:24:27.473 } 00:24:27.473 }' 00:24:27.473 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:27.732 08:00:12 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:27.732 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:27.992 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:27.992 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:27.992 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:27.992 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:24:27.992 [2024-07-15 08:00:12.737021] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:28.252 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a86dbe4e-1587-4899-835a-fbe44fa5a01b 00:24:28.252 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z a86dbe4e-1587-4899-835a-fbe44fa5a01b ']' 00:24:28.252 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:28.252 [2024-07-15 08:00:12.929313] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:28.252 [2024-07-15 08:00:12.929323] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:28.252 [2024-07-15 08:00:12.929356] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:28.252 [2024-07-15 08:00:12.929395] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:28.252 [2024-07-15 08:00:12.929401] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219e3d0 name raid_bdev1, state offline 00:24:28.252 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.252 08:00:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:24:28.512 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:24:28.512 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:24:28.512 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:28.512 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:28.771 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:28.771 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:29.361 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:29.361 08:00:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:29.361 08:00:14 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:29.361 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:29.621 [2024-07-15 08:00:14.248598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:29.621 [2024-07-15 08:00:14.249654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:29.621 [2024-07-15 08:00:14.249694] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:29.621 [2024-07-15 08:00:14.249725] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:29.621 [2024-07-15 08:00:14.249736] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:29.621 [2024-07-15 08:00:14.249741] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219c960 name raid_bdev1, state configuring 00:24:29.621 request: 00:24:29.621 { 00:24:29.621 "name": "raid_bdev1", 00:24:29.621 "raid_level": "raid1", 00:24:29.621 "base_bdevs": [ 00:24:29.621 "malloc1", 00:24:29.621 "malloc2" 00:24:29.621 ], 00:24:29.621 "superblock": false, 00:24:29.621 "method": "bdev_raid_create", 00:24:29.621 "req_id": 1 00:24:29.621 } 00:24:29.621 Got JSON-RPC error response 00:24:29.621 response: 00:24:29.621 { 00:24:29.621 "code": -17, 00:24:29.621 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:29.621 } 00:24:29.621 08:00:14 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@651 -- # es=1 00:24:29.621 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:29.621 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:29.621 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:29.621 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.621 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:24:29.881 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:24:29.881 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:24:29.881 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:29.881 [2024-07-15 08:00:14.633536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:29.881 [2024-07-15 08:00:14.633561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.881 [2024-07-15 08:00:14.633574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219dba0 00:24:29.881 [2024-07-15 08:00:14.633580] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.881 [2024-07-15 08:00:14.634838] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.881 [2024-07-15 08:00:14.634857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:29.881 [2024-07-15 08:00:14.634903] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:29.881 [2024-07-15 08:00:14.634919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:30.152 pt1 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:30.152 "name": "raid_bdev1", 00:24:30.152 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:30.152 "strip_size_kb": 0, 00:24:30.152 "state": "configuring", 00:24:30.152 "raid_level": "raid1", 00:24:30.152 "superblock": true, 00:24:30.152 "num_base_bdevs": 2, 00:24:30.152 "num_base_bdevs_discovered": 1, 00:24:30.152 "num_base_bdevs_operational": 2, 00:24:30.152 "base_bdevs_list": [ 00:24:30.152 { 00:24:30.152 "name": "pt1", 00:24:30.152 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:30.152 "is_configured": true, 00:24:30.152 "data_offset": 256, 00:24:30.152 "data_size": 7936 00:24:30.152 }, 00:24:30.152 { 00:24:30.152 "name": null, 00:24:30.152 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:30.152 "is_configured": false, 00:24:30.152 "data_offset": 256, 00:24:30.152 "data_size": 7936 00:24:30.152 } 00:24:30.152 ] 00:24:30.152 }' 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:30.152 08:00:14 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:30.720 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:24:30.720 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:24:30.720 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:30.720 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:30.980 [2024-07-15 08:00:15.559887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:30.980 [2024-07-15 08:00:15.559913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.980 [2024-07-15 08:00:15.559929] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff2e00 00:24:30.980 [2024-07-15 08:00:15.559937] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.980 [2024-07-15 08:00:15.560189] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.980 [2024-07-15 08:00:15.560200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:30.980 [2024-07-15 08:00:15.560237] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:30.980 [2024-07-15 08:00:15.560248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:30.980 [2024-07-15 08:00:15.560319] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ff0820 00:24:30.980 [2024-07-15 08:00:15.560325] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:30.980 [2024-07-15 08:00:15.560458] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x219fe90 00:24:30.980 [2024-07-15 08:00:15.560556] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ff0820 00:24:30.980 [2024-07-15 08:00:15.560561] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ff0820 00:24:30.980 [2024-07-15 08:00:15.560635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:30.980 pt2 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.980 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.240 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.240 "name": "raid_bdev1", 00:24:31.240 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:31.240 "strip_size_kb": 0, 00:24:31.240 "state": "online", 00:24:31.240 "raid_level": "raid1", 00:24:31.240 "superblock": true, 00:24:31.240 "num_base_bdevs": 2, 00:24:31.240 "num_base_bdevs_discovered": 2, 00:24:31.240 "num_base_bdevs_operational": 2, 00:24:31.240 "base_bdevs_list": [ 00:24:31.240 { 00:24:31.240 "name": "pt1", 00:24:31.240 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:31.240 "is_configured": true, 00:24:31.240 "data_offset": 256, 00:24:31.240 "data_size": 7936 00:24:31.240 }, 00:24:31.240 { 00:24:31.240 "name": "pt2", 00:24:31.240 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:31.240 "is_configured": true, 00:24:31.240 "data_offset": 256, 00:24:31.240 "data_size": 7936 00:24:31.240 } 00:24:31.240 ] 00:24:31.240 }' 00:24:31.240 08:00:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.240 08:00:15 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:31.847 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:24:31.847 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:31.848 [2024-07-15 08:00:16.486434] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:31.848 "name": "raid_bdev1", 00:24:31.848 "aliases": [ 00:24:31.848 "a86dbe4e-1587-4899-835a-fbe44fa5a01b" 00:24:31.848 ], 00:24:31.848 "product_name": "Raid Volume", 00:24:31.848 "block_size": 4096, 00:24:31.848 "num_blocks": 7936, 00:24:31.848 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:31.848 "assigned_rate_limits": { 00:24:31.848 "rw_ios_per_sec": 0, 00:24:31.848 "rw_mbytes_per_sec": 0, 00:24:31.848 "r_mbytes_per_sec": 0, 00:24:31.848 "w_mbytes_per_sec": 0 00:24:31.848 }, 00:24:31.848 "claimed": false, 00:24:31.848 "zoned": false, 00:24:31.848 "supported_io_types": { 00:24:31.848 "read": true, 00:24:31.848 "write": true, 00:24:31.848 "unmap": false, 00:24:31.848 "flush": false, 00:24:31.848 "reset": true, 00:24:31.848 "nvme_admin": false, 00:24:31.848 "nvme_io": false, 00:24:31.848 "nvme_io_md": false, 00:24:31.848 "write_zeroes": true, 00:24:31.848 "zcopy": false, 00:24:31.848 "get_zone_info": false, 00:24:31.848 "zone_management": false, 00:24:31.848 "zone_append": false, 00:24:31.848 "compare": false, 00:24:31.848 "compare_and_write": false, 00:24:31.848 "abort": false, 00:24:31.848 "seek_hole": false, 00:24:31.848 "seek_data": false, 00:24:31.848 "copy": false, 00:24:31.848 "nvme_iov_md": false 00:24:31.848 }, 00:24:31.848 "memory_domains": [ 00:24:31.848 { 00:24:31.848 "dma_device_id": "system", 00:24:31.848 "dma_device_type": 1 00:24:31.848 }, 00:24:31.848 { 00:24:31.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:31.848 "dma_device_type": 2 00:24:31.848 }, 00:24:31.848 { 00:24:31.848 "dma_device_id": "system", 00:24:31.848 "dma_device_type": 1 00:24:31.848 }, 00:24:31.848 { 00:24:31.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:31.848 "dma_device_type": 2 00:24:31.848 } 00:24:31.848 ], 00:24:31.848 "driver_specific": { 00:24:31.848 "raid": { 00:24:31.848 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:31.848 "strip_size_kb": 0, 00:24:31.848 "state": "online", 00:24:31.848 "raid_level": "raid1", 00:24:31.848 "superblock": true, 00:24:31.848 "num_base_bdevs": 2, 00:24:31.848 "num_base_bdevs_discovered": 2, 00:24:31.848 "num_base_bdevs_operational": 2, 00:24:31.848 "base_bdevs_list": [ 00:24:31.848 { 00:24:31.848 "name": "pt1", 00:24:31.848 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:31.848 "is_configured": true, 00:24:31.848 "data_offset": 256, 00:24:31.848 "data_size": 7936 00:24:31.848 }, 00:24:31.848 { 00:24:31.848 "name": "pt2", 00:24:31.848 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:31.848 "is_configured": true, 00:24:31.848 "data_offset": 256, 00:24:31.848 "data_size": 7936 00:24:31.848 } 00:24:31.848 ] 00:24:31.848 } 00:24:31.848 } 00:24:31.848 }' 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:31.848 pt2' 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:31.848 08:00:16 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:31.848 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:32.125 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:32.125 "name": "pt1", 00:24:32.125 "aliases": [ 00:24:32.125 "00000000-0000-0000-0000-000000000001" 00:24:32.125 ], 00:24:32.125 "product_name": "passthru", 00:24:32.125 "block_size": 4096, 00:24:32.125 "num_blocks": 8192, 00:24:32.125 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:32.125 "assigned_rate_limits": { 00:24:32.125 "rw_ios_per_sec": 0, 00:24:32.125 "rw_mbytes_per_sec": 0, 00:24:32.125 "r_mbytes_per_sec": 0, 00:24:32.125 "w_mbytes_per_sec": 0 00:24:32.125 }, 00:24:32.125 "claimed": true, 00:24:32.125 "claim_type": "exclusive_write", 00:24:32.125 "zoned": false, 00:24:32.125 "supported_io_types": { 00:24:32.125 "read": true, 00:24:32.125 "write": true, 00:24:32.125 "unmap": true, 00:24:32.125 "flush": true, 00:24:32.125 "reset": true, 00:24:32.125 "nvme_admin": false, 00:24:32.125 "nvme_io": false, 00:24:32.125 "nvme_io_md": false, 00:24:32.125 "write_zeroes": true, 00:24:32.125 "zcopy": true, 00:24:32.125 "get_zone_info": false, 00:24:32.125 "zone_management": false, 00:24:32.125 "zone_append": false, 00:24:32.125 "compare": false, 00:24:32.125 "compare_and_write": false, 00:24:32.125 "abort": true, 00:24:32.125 "seek_hole": false, 00:24:32.125 "seek_data": false, 00:24:32.126 "copy": true, 00:24:32.126 "nvme_iov_md": false 00:24:32.126 }, 00:24:32.126 "memory_domains": [ 00:24:32.126 { 00:24:32.126 "dma_device_id": "system", 00:24:32.126 "dma_device_type": 1 00:24:32.126 }, 00:24:32.126 { 00:24:32.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:32.126 "dma_device_type": 2 00:24:32.126 } 00:24:32.126 ], 00:24:32.126 "driver_specific": { 00:24:32.126 "passthru": { 00:24:32.126 "name": "pt1", 00:24:32.126 "base_bdev_name": "malloc1" 00:24:32.126 } 00:24:32.126 } 00:24:32.126 }' 00:24:32.126 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:32.126 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:32.126 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:32.126 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:32.126 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:32.384 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:32.384 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:32.384 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:32.384 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:32.384 08:00:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:32.384 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:32.384 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:32.384 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:32.384 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:32.384 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:32.643 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:32.643 "name": "pt2", 00:24:32.643 "aliases": [ 00:24:32.643 "00000000-0000-0000-0000-000000000002" 00:24:32.643 ], 00:24:32.643 "product_name": "passthru", 00:24:32.643 "block_size": 4096, 00:24:32.643 "num_blocks": 8192, 00:24:32.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:32.643 "assigned_rate_limits": { 00:24:32.643 "rw_ios_per_sec": 0, 00:24:32.643 "rw_mbytes_per_sec": 0, 00:24:32.643 "r_mbytes_per_sec": 0, 00:24:32.643 "w_mbytes_per_sec": 0 00:24:32.643 }, 00:24:32.643 "claimed": true, 00:24:32.643 "claim_type": "exclusive_write", 00:24:32.643 "zoned": false, 00:24:32.643 "supported_io_types": { 00:24:32.643 "read": true, 00:24:32.643 "write": true, 00:24:32.643 "unmap": true, 00:24:32.643 "flush": true, 00:24:32.643 "reset": true, 00:24:32.643 "nvme_admin": false, 00:24:32.643 "nvme_io": false, 00:24:32.643 "nvme_io_md": false, 00:24:32.643 "write_zeroes": true, 00:24:32.643 "zcopy": true, 00:24:32.643 "get_zone_info": false, 00:24:32.643 "zone_management": false, 00:24:32.643 "zone_append": false, 00:24:32.643 "compare": false, 00:24:32.643 "compare_and_write": false, 00:24:32.643 "abort": true, 00:24:32.643 "seek_hole": false, 00:24:32.643 "seek_data": false, 00:24:32.643 "copy": true, 00:24:32.643 "nvme_iov_md": false 00:24:32.643 }, 00:24:32.643 "memory_domains": [ 00:24:32.643 { 00:24:32.643 "dma_device_id": "system", 00:24:32.643 "dma_device_type": 1 00:24:32.643 }, 00:24:32.643 { 00:24:32.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:32.643 "dma_device_type": 2 00:24:32.643 } 00:24:32.643 ], 00:24:32.643 "driver_specific": { 00:24:32.643 "passthru": { 00:24:32.643 "name": "pt2", 00:24:32.643 "base_bdev_name": "malloc2" 00:24:32.643 } 00:24:32.643 } 00:24:32.643 }' 00:24:32.643 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:32.643 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:32.643 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:24:32.643 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:32.902 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 
-- # jq -r '.[] | .uuid' 00:24:33.161 [2024-07-15 08:00:17.809771] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:33.161 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' a86dbe4e-1587-4899-835a-fbe44fa5a01b '!=' a86dbe4e-1587-4899-835a-fbe44fa5a01b ']' 00:24:33.161 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:24:33.161 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:33.161 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:24:33.161 08:00:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:33.421 [2024-07-15 08:00:18.002079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.421 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.681 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.681 "name": "raid_bdev1", 00:24:33.681 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:33.681 "strip_size_kb": 0, 00:24:33.681 "state": "online", 00:24:33.681 "raid_level": "raid1", 00:24:33.681 "superblock": true, 00:24:33.681 "num_base_bdevs": 2, 00:24:33.681 "num_base_bdevs_discovered": 1, 00:24:33.681 "num_base_bdevs_operational": 1, 00:24:33.681 "base_bdevs_list": [ 00:24:33.681 { 00:24:33.681 "name": null, 00:24:33.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.681 "is_configured": false, 00:24:33.681 "data_offset": 256, 00:24:33.681 "data_size": 7936 00:24:33.681 }, 00:24:33.681 { 00:24:33.681 "name": "pt2", 00:24:33.681 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:33.681 "is_configured": true, 00:24:33.681 "data_offset": 256, 00:24:33.681 "data_size": 7936 00:24:33.681 } 00:24:33.681 ] 00:24:33.681 }' 00:24:33.681 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.681 08:00:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
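The state check that was just traced reduces to one RPC call piped through jq. Below is a minimal stand-alone sketch of that pipeline, using the socket path, bdev name, and jq filter exactly as they appear in the trace; the shell variables (rpc, sock, info) are illustrative only and are not part of the harness:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Fetch every raid bdev known to the target and keep only raid_bdev1
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  # These are the fields verify_raid_bdev_state compares against its expected values
  echo "$info" | jq -r '.state, .raid_level, .num_base_bdevs_discovered, .num_base_bdevs_operational'

After pt1 is deleted the expected answer is online / raid1 / 1 / 1, matching the raid_bdev_info JSON captured just above.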
00:24:34.250 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:34.250 [2024-07-15 08:00:18.924387] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:34.250 [2024-07-15 08:00:18.924403] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:34.250 [2024-07-15 08:00:18.924435] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:34.250 [2024-07-15 08:00:18.924465] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:34.250 [2024-07-15 08:00:18.924471] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ff0820 name raid_bdev1, state offline 00:24:34.250 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.250 08:00:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:24:34.509 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:24:34.509 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:24:34.509 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:24:34.509 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:34.509 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:34.768 [2024-07-15 08:00:19.405586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:34.768 [2024-07-15 08:00:19.405613] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:34.768 [2024-07-15 08:00:19.405623] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219ad70 00:24:34.768 [2024-07-15 08:00:19.405630] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:34.768 [2024-07-15 08:00:19.406880] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:34.768 [2024-07-15 08:00:19.406899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:34.768 [2024-07-15 08:00:19.406941] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:34.768 [2024-07-15 08:00:19.406958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:34.768 [2024-07-15 08:00:19.407017] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x219fae0 00:24:34.768 [2024-07-15 08:00:19.407023] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:34.768 [2024-07-15 08:00:19.407158] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x219d160 00:24:34.768 [2024-07-15 08:00:19.407250] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x219fae0 00:24:34.768 [2024-07-15 08:00:19.407255] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x219fae0 00:24:34.768 [2024-07-15 08:00:19.407325] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:34.768 pt2 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.768 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.027 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.027 "name": "raid_bdev1", 00:24:35.027 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:35.027 "strip_size_kb": 0, 00:24:35.027 "state": "online", 00:24:35.027 "raid_level": "raid1", 00:24:35.027 "superblock": true, 00:24:35.027 "num_base_bdevs": 2, 00:24:35.027 "num_base_bdevs_discovered": 1, 00:24:35.027 "num_base_bdevs_operational": 1, 00:24:35.027 "base_bdevs_list": [ 00:24:35.027 { 00:24:35.027 "name": null, 00:24:35.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.027 "is_configured": false, 00:24:35.027 "data_offset": 256, 00:24:35.027 "data_size": 7936 00:24:35.027 }, 00:24:35.027 { 00:24:35.027 "name": "pt2", 00:24:35.027 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:35.027 "is_configured": true, 00:24:35.027 "data_offset": 256, 00:24:35.027 "data_size": 7936 00:24:35.027 } 00:24:35.027 ] 00:24:35.027 }' 00:24:35.027 08:00:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.027 08:00:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:35.596 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:24:35.596 [2024-07-15 08:00:20.291816] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:35.596 [2024-07-15 08:00:20.291836] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:35.596 [2024-07-15 08:00:20.291872] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:35.596 [2024-07-15 08:00:20.291901] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:35.596 [2024-07-15 08:00:20.291907] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219fae0 name raid_bdev1, state offline 00:24:35.596 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.596 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:35.854 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:35.855 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:35.855 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:24:35.855 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:36.114 [2024-07-15 08:00:20.684804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:36.114 [2024-07-15 08:00:20.684835] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:36.114 [2024-07-15 08:00:20.684847] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff23e0 00:24:36.114 [2024-07-15 08:00:20.684854] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:36.114 [2024-07-15 08:00:20.686122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:36.114 [2024-07-15 08:00:20.686141] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:36.114 [2024-07-15 08:00:20.686187] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:36.114 [2024-07-15 08:00:20.686203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:36.114 [2024-07-15 08:00:20.686278] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:24:36.114 [2024-07-15 08:00:20.686285] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:36.114 [2024-07-15 08:00:20.686298] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219ffa0 name raid_bdev1, state configuring 00:24:36.114 [2024-07-15 08:00:20.686313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:36.114 [2024-07-15 08:00:20.686352] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x219ffa0 00:24:36.114 [2024-07-15 08:00:20.686357] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:36.114 [2024-07-15 08:00:20.686493] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x219c850 00:24:36.114 [2024-07-15 08:00:20.686586] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x219ffa0 
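The reassembly traced here comes down to two RPCs: re-creating a passthru bdev makes the examine path find the RAID superblock on it again, and the copy with the higher sequence number (pt2 at seq 4 versus the existing 2) wins. A hedged sketch of that sequence, with the command and UUID copied from the trace and the rpc/sock variables introduced only for readability:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Re-register pt1 on top of malloc1; examine finds the raid superblock on it and claims the bdev
  "$rpc" -s "$sock" bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  # raid_bdev1 is then rebuilt around the newer superblock and should report itself online again
  "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'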
00:24:36.114 [2024-07-15 08:00:20.686591] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x219ffa0 00:24:36.114 [2024-07-15 08:00:20.686665] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:36.114 pt1 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.114 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.373 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.373 "name": "raid_bdev1", 00:24:36.373 "uuid": "a86dbe4e-1587-4899-835a-fbe44fa5a01b", 00:24:36.373 "strip_size_kb": 0, 00:24:36.373 "state": "online", 00:24:36.373 "raid_level": "raid1", 00:24:36.373 "superblock": true, 00:24:36.373 "num_base_bdevs": 2, 00:24:36.373 "num_base_bdevs_discovered": 1, 00:24:36.373 "num_base_bdevs_operational": 1, 00:24:36.373 "base_bdevs_list": [ 00:24:36.373 { 00:24:36.373 "name": null, 00:24:36.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.373 "is_configured": false, 00:24:36.373 "data_offset": 256, 00:24:36.373 "data_size": 7936 00:24:36.373 }, 00:24:36.373 { 00:24:36.373 "name": "pt2", 00:24:36.373 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:36.373 "is_configured": true, 00:24:36.373 "data_offset": 256, 00:24:36.373 "data_size": 7936 00:24:36.373 } 00:24:36.373 ] 00:24:36.373 }' 00:24:36.373 08:00:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.373 08:00:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:36.941 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:36.941 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:36.941 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:36.941 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:36.941 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:37.201 [2024-07-15 08:00:21.823840] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' a86dbe4e-1587-4899-835a-fbe44fa5a01b '!=' a86dbe4e-1587-4899-835a-fbe44fa5a01b ']' 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1743586 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 1743586 ']' 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 1743586 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1743586 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1743586' 00:24:37.201 killing process with pid 1743586 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 1743586 00:24:37.201 [2024-07-15 08:00:21.894109] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:37.201 [2024-07-15 08:00:21.894144] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:37.201 [2024-07-15 08:00:21.894173] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:37.201 [2024-07-15 08:00:21.894179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219ffa0 name raid_bdev1, state offline 00:24:37.201 08:00:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 1743586 00:24:37.201 [2024-07-15 08:00:21.903364] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:37.462 08:00:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:24:37.462 00:24:37.462 real 0m13.750s 00:24:37.462 user 0m25.519s 00:24:37.462 sys 0m2.058s 00:24:37.462 08:00:22 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:37.462 08:00:22 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:24:37.462 ************************************ 00:24:37.462 END TEST raid_superblock_test_4k 00:24:37.462 ************************************ 00:24:37.462 08:00:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:37.462 08:00:22 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:24:37.462 08:00:22 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:24:37.462 08:00:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:37.462 08:00:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:37.462 08:00:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:37.462 
************************************ 00:24:37.462 START TEST raid_rebuild_test_sb_4k 00:24:37.462 ************************************ 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1746596 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1746596 /var/tmp/spdk-raid.sock 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1746596 ']' 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
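Everything the rebuild test does is driven through a bdevperf process started idle on a private RPC socket. A sketch of that startup, using the exact command line recorded in the trace; the bdevperf and rpc_server variables are illustrative, waitforlisten is assumed to be the helper supplied by the test's common scripts, and the flag comments reflect bdevperf's usual semantics rather than anything stated in the log:

  bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
  rpc_server=/var/tmp/spdk-raid.sock
  # -z keeps bdevperf idle until it is driven over RPC; -T raid_bdev1 names the target job,
  # and -t/-w/-M/-o/-q describe the 60 s randrw workload at queue depth 2 with 3M I/Os
  "$bdevperf" -r "$rpc_server" -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  # Block until the UNIX-domain RPC socket accepts connections
  # (the 'Waiting for process to start up and listen...' message that follows in the log)
  waitforlisten "$raid_pid" "$rpc_server"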
00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:37.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:37.462 08:00:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:37.462 [2024-07-15 08:00:22.177820] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:24:37.462 [2024-07-15 08:00:22.177870] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1746596 ] 00:24:37.463 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:37.463 Zero copy mechanism will not be used. 00:24:37.723 [2024-07-15 08:00:22.268580] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:37.723 [2024-07-15 08:00:22.346680] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:37.723 [2024-07-15 08:00:22.392051] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:37.723 [2024-07-15 08:00:22.392077] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:38.293 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:38.293 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:24:38.293 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.293 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:24:38.553 BaseBdev1_malloc 00:24:38.553 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:38.812 [2024-07-15 08:00:23.382941] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:38.812 [2024-07-15 08:00:23.382982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.812 [2024-07-15 08:00:23.382995] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf26d30 00:24:38.812 [2024-07-15 08:00:23.383002] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.812 [2024-07-15 08:00:23.384265] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.812 [2024-07-15 08:00:23.384284] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:38.812 BaseBdev1 00:24:38.812 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.812 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:24:39.072 BaseBdev2_malloc 
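Each base bdev used by the rebuild test is assembled the same way, as the BaseBdev1 trace shows: a malloc disk is created and a passthru bdev is layered on top of it, so the test can later remove and re-add a base bdev without touching the data underneath. A minimal sketch of that pair of RPCs, with the arguments copied from the trace; the rpc/sock variables are illustrative, and the 32/4096 arguments are read here as size-in-MB and block size:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # 32 MB malloc disk with 4096-byte blocks, named BaseBdev1_malloc
  "$rpc" -s "$sock" bdev_malloc_create 32 4096 -b BaseBdev1_malloc
  # Passthru bdev BaseBdev1 on top of it; this is the name handed to bdev_raid_create later
  "$rpc" -s "$sock" bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1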
00:24:39.072 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:39.072 [2024-07-15 08:00:23.753730] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:39.072 [2024-07-15 08:00:23.753755] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.072 [2024-07-15 08:00:23.753766] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d9c60 00:24:39.072 [2024-07-15 08:00:23.753772] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.072 [2024-07-15 08:00:23.754929] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.072 [2024-07-15 08:00:23.754947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:39.072 BaseBdev2 00:24:39.072 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:24:39.332 spare_malloc 00:24:39.332 08:00:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:39.591 spare_delay 00:24:39.591 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:39.591 [2024-07-15 08:00:24.320810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:39.591 [2024-07-15 08:00:24.320836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.591 [2024-07-15 08:00:24.320846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c9ec0 00:24:39.591 [2024-07-15 08:00:24.320853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.591 [2024-07-15 08:00:24.322003] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.591 [2024-07-15 08:00:24.322021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:39.591 spare 00:24:39.591 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:39.851 [2024-07-15 08:00:24.517327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:39.851 [2024-07-15 08:00:24.518309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:39.851 [2024-07-15 08:00:24.518422] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10c1390 00:24:39.851 [2024-07-15 08:00:24.518430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:39.851 [2024-07-15 08:00:24.518571] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1d7c0 00:24:39.851 [2024-07-15 08:00:24.518677] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10c1390 00:24:39.851 [2024-07-15 08:00:24.518683] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x10c1390 00:24:39.851 [2024-07-15 08:00:24.518759] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.851 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:40.111 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:40.111 "name": "raid_bdev1", 00:24:40.111 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:40.111 "strip_size_kb": 0, 00:24:40.111 "state": "online", 00:24:40.111 "raid_level": "raid1", 00:24:40.111 "superblock": true, 00:24:40.111 "num_base_bdevs": 2, 00:24:40.111 "num_base_bdevs_discovered": 2, 00:24:40.111 "num_base_bdevs_operational": 2, 00:24:40.111 "base_bdevs_list": [ 00:24:40.111 { 00:24:40.111 "name": "BaseBdev1", 00:24:40.111 "uuid": "9fe82a78-3796-528a-b042-a323be6bb817", 00:24:40.111 "is_configured": true, 00:24:40.111 "data_offset": 256, 00:24:40.111 "data_size": 7936 00:24:40.111 }, 00:24:40.111 { 00:24:40.111 "name": "BaseBdev2", 00:24:40.111 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:40.111 "is_configured": true, 00:24:40.111 "data_offset": 256, 00:24:40.111 "data_size": 7936 00:24:40.111 } 00:24:40.111 ] 00:24:40.111 }' 00:24:40.111 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:40.111 08:00:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:40.678 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:40.678 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:40.938 [2024-07-15 08:00:25.447872] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:40.938 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:41.197 [2024-07-15 08:00:25.836673] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10bfb00 00:24:41.197 /dev/nbd0 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:41.197 1+0 records in 00:24:41.197 1+0 records out 00:24:41.197 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219821 s, 18.6 MB/s 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:41.197 08:00:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:41.767 7936+0 records in 00:24:41.767 7936+0 records out 00:24:41.767 32505856 bytes (33 MB, 31 MiB) copied, 0.559805 s, 58.1 MB/s 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:41.767 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:42.026 [2024-07-15 08:00:26.627597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:42.026 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:42.285 [2024-07-15 08:00:26.820113] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.285 08:00:26 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.285 08:00:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.285 08:00:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.285 "name": "raid_bdev1", 00:24:42.285 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:42.285 "strip_size_kb": 0, 00:24:42.285 "state": "online", 00:24:42.285 "raid_level": "raid1", 00:24:42.285 "superblock": true, 00:24:42.285 "num_base_bdevs": 2, 00:24:42.285 "num_base_bdevs_discovered": 1, 00:24:42.285 "num_base_bdevs_operational": 1, 00:24:42.285 "base_bdevs_list": [ 00:24:42.285 { 00:24:42.285 "name": null, 00:24:42.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.285 "is_configured": false, 00:24:42.285 "data_offset": 256, 00:24:42.285 "data_size": 7936 00:24:42.285 }, 00:24:42.285 { 00:24:42.285 "name": "BaseBdev2", 00:24:42.285 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:42.285 "is_configured": true, 00:24:42.285 "data_offset": 256, 00:24:42.285 "data_size": 7936 00:24:42.285 } 00:24:42.285 ] 00:24:42.285 }' 00:24:42.285 08:00:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.285 08:00:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:42.855 08:00:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:43.116 [2024-07-15 08:00:27.738478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:43.116 [2024-07-15 08:00:27.741995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10bfaa0 00:24:43.116 [2024-07-15 08:00:27.743602] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:43.116 08:00:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:44.056 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:44.056 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:44.056 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:44.056 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:44.056 08:00:28 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:44.056 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.056 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.316 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:44.316 "name": "raid_bdev1", 00:24:44.316 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:44.316 "strip_size_kb": 0, 00:24:44.316 "state": "online", 00:24:44.316 "raid_level": "raid1", 00:24:44.316 "superblock": true, 00:24:44.316 "num_base_bdevs": 2, 00:24:44.316 "num_base_bdevs_discovered": 2, 00:24:44.316 "num_base_bdevs_operational": 2, 00:24:44.316 "process": { 00:24:44.316 "type": "rebuild", 00:24:44.316 "target": "spare", 00:24:44.316 "progress": { 00:24:44.316 "blocks": 2816, 00:24:44.316 "percent": 35 00:24:44.316 } 00:24:44.316 }, 00:24:44.316 "base_bdevs_list": [ 00:24:44.316 { 00:24:44.316 "name": "spare", 00:24:44.316 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:44.316 "is_configured": true, 00:24:44.316 "data_offset": 256, 00:24:44.316 "data_size": 7936 00:24:44.316 }, 00:24:44.316 { 00:24:44.316 "name": "BaseBdev2", 00:24:44.316 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:44.316 "is_configured": true, 00:24:44.316 "data_offset": 256, 00:24:44.316 "data_size": 7936 00:24:44.316 } 00:24:44.316 ] 00:24:44.316 }' 00:24:44.316 08:00:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:44.316 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:44.316 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:44.316 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:44.316 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:44.575 [2024-07-15 08:00:29.224089] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.575 [2024-07-15 08:00:29.252451] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:44.575 [2024-07-15 08:00:29.252483] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.575 [2024-07-15 08:00:29.252493] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.575 [2024-07-15 08:00:29.252498] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.575 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.835 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.835 "name": "raid_bdev1", 00:24:44.835 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:44.835 "strip_size_kb": 0, 00:24:44.835 "state": "online", 00:24:44.835 "raid_level": "raid1", 00:24:44.835 "superblock": true, 00:24:44.835 "num_base_bdevs": 2, 00:24:44.835 "num_base_bdevs_discovered": 1, 00:24:44.835 "num_base_bdevs_operational": 1, 00:24:44.835 "base_bdevs_list": [ 00:24:44.835 { 00:24:44.835 "name": null, 00:24:44.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.835 "is_configured": false, 00:24:44.835 "data_offset": 256, 00:24:44.835 "data_size": 7936 00:24:44.835 }, 00:24:44.835 { 00:24:44.835 "name": "BaseBdev2", 00:24:44.835 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:44.835 "is_configured": true, 00:24:44.835 "data_offset": 256, 00:24:44.835 "data_size": 7936 00:24:44.835 } 00:24:44.835 ] 00:24:44.835 }' 00:24:44.835 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.835 08:00:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.404 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.665 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.665 "name": "raid_bdev1", 00:24:45.665 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:45.665 "strip_size_kb": 0, 00:24:45.665 "state": "online", 00:24:45.665 "raid_level": "raid1", 00:24:45.665 "superblock": true, 00:24:45.665 "num_base_bdevs": 2, 00:24:45.665 "num_base_bdevs_discovered": 1, 00:24:45.665 "num_base_bdevs_operational": 1, 00:24:45.665 "base_bdevs_list": [ 00:24:45.665 { 00:24:45.665 "name": null, 00:24:45.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.665 "is_configured": false, 00:24:45.665 "data_offset": 
256, 00:24:45.665 "data_size": 7936 00:24:45.665 }, 00:24:45.665 { 00:24:45.665 "name": "BaseBdev2", 00:24:45.665 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:45.665 "is_configured": true, 00:24:45.665 "data_offset": 256, 00:24:45.665 "data_size": 7936 00:24:45.665 } 00:24:45.665 ] 00:24:45.665 }' 00:24:45.665 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.665 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.665 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.665 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.665 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:45.924 [2024-07-15 08:00:30.495507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:45.924 [2024-07-15 08:00:30.498753] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10bf9e0 00:24:45.924 [2024-07-15 08:00:30.499903] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:45.924 08:00:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:46.864 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.864 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.864 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.864 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.865 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.865 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.865 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.125 "name": "raid_bdev1", 00:24:47.125 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:47.125 "strip_size_kb": 0, 00:24:47.125 "state": "online", 00:24:47.125 "raid_level": "raid1", 00:24:47.125 "superblock": true, 00:24:47.125 "num_base_bdevs": 2, 00:24:47.125 "num_base_bdevs_discovered": 2, 00:24:47.125 "num_base_bdevs_operational": 2, 00:24:47.125 "process": { 00:24:47.125 "type": "rebuild", 00:24:47.125 "target": "spare", 00:24:47.125 "progress": { 00:24:47.125 "blocks": 2816, 00:24:47.125 "percent": 35 00:24:47.125 } 00:24:47.125 }, 00:24:47.125 "base_bdevs_list": [ 00:24:47.125 { 00:24:47.125 "name": "spare", 00:24:47.125 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:47.125 "is_configured": true, 00:24:47.125 "data_offset": 256, 00:24:47.125 "data_size": 7936 00:24:47.125 }, 00:24:47.125 { 00:24:47.125 "name": "BaseBdev2", 00:24:47.125 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:47.125 "is_configured": true, 00:24:47.125 "data_offset": 256, 00:24:47.125 "data_size": 7936 00:24:47.125 } 00:24:47.125 ] 00:24:47.125 }' 00:24:47.125 
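The rebuild in flight at this point was kicked off by re-attaching the spare with bdev_raid_add_base_bdev, and the verify_raid_bdev_process checks that follow simply pull bdev_raid_get_bdevs output and inspect the .process fields with jq. A condensed sketch of that check, assuming the same RPC socket and bdev names shown in this log:

# kick off the rebuild onto the spare, then confirm a rebuild targeting "spare" is reported
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc bdev_raid_add_base_bdev raid_bdev1 spare
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
[[ $(jq -r '.process.type   // "none"' <<< "$info") == rebuild ]]
[[ $(jq -r '.process.target // "none"' <<< "$info") == spare   ]]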
08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:47.125 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=897 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.125 08:00:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.386 08:00:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.386 "name": "raid_bdev1", 00:24:47.386 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:47.386 "strip_size_kb": 0, 00:24:47.386 "state": "online", 00:24:47.386 "raid_level": "raid1", 00:24:47.386 "superblock": true, 00:24:47.386 "num_base_bdevs": 2, 00:24:47.386 "num_base_bdevs_discovered": 2, 00:24:47.386 "num_base_bdevs_operational": 2, 00:24:47.386 "process": { 00:24:47.386 "type": "rebuild", 00:24:47.386 "target": "spare", 00:24:47.386 "progress": { 00:24:47.386 "blocks": 3584, 00:24:47.386 "percent": 45 00:24:47.386 } 00:24:47.386 }, 00:24:47.386 "base_bdevs_list": [ 00:24:47.386 { 00:24:47.386 "name": "spare", 00:24:47.386 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:47.386 "is_configured": true, 00:24:47.386 "data_offset": 256, 00:24:47.386 "data_size": 7936 00:24:47.386 }, 00:24:47.386 { 00:24:47.386 "name": "BaseBdev2", 00:24:47.386 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:47.386 "is_configured": true, 00:24:47.386 "data_offset": 256, 00:24:47.386 "data_size": 7936 00:24:47.386 } 00:24:47.386 ] 00:24:47.386 }' 00:24:47.386 08:00:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.386 
08:00:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:47.386 08:00:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.386 08:00:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.386 08:00:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.766 "name": "raid_bdev1", 00:24:48.766 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:48.766 "strip_size_kb": 0, 00:24:48.766 "state": "online", 00:24:48.766 "raid_level": "raid1", 00:24:48.766 "superblock": true, 00:24:48.766 "num_base_bdevs": 2, 00:24:48.766 "num_base_bdevs_discovered": 2, 00:24:48.766 "num_base_bdevs_operational": 2, 00:24:48.766 "process": { 00:24:48.766 "type": "rebuild", 00:24:48.766 "target": "spare", 00:24:48.766 "progress": { 00:24:48.766 "blocks": 6912, 00:24:48.766 "percent": 87 00:24:48.766 } 00:24:48.766 }, 00:24:48.766 "base_bdevs_list": [ 00:24:48.766 { 00:24:48.766 "name": "spare", 00:24:48.766 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:48.766 "is_configured": true, 00:24:48.766 "data_offset": 256, 00:24:48.766 "data_size": 7936 00:24:48.766 }, 00:24:48.766 { 00:24:48.766 "name": "BaseBdev2", 00:24:48.766 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:48.766 "is_configured": true, 00:24:48.766 "data_offset": 256, 00:24:48.766 "data_size": 7936 00:24:48.766 } 00:24:48.766 ] 00:24:48.766 }' 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.766 08:00:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:49.063 [2024-07-15 08:00:33.617938] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:49.063 [2024-07-15 08:00:33.617987] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:49.063 [2024-07-15 08:00:33.618050] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:24:49.634 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:49.634 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.634 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.634 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.634 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.634 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.894 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.894 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.894 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.894 "name": "raid_bdev1", 00:24:49.894 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:49.894 "strip_size_kb": 0, 00:24:49.894 "state": "online", 00:24:49.894 "raid_level": "raid1", 00:24:49.894 "superblock": true, 00:24:49.894 "num_base_bdevs": 2, 00:24:49.894 "num_base_bdevs_discovered": 2, 00:24:49.894 "num_base_bdevs_operational": 2, 00:24:49.894 "base_bdevs_list": [ 00:24:49.894 { 00:24:49.894 "name": "spare", 00:24:49.894 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:49.894 "is_configured": true, 00:24:49.894 "data_offset": 256, 00:24:49.894 "data_size": 7936 00:24:49.894 }, 00:24:49.894 { 00:24:49.894 "name": "BaseBdev2", 00:24:49.894 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:49.894 "is_configured": true, 00:24:49.894 "data_offset": 256, 00:24:49.894 "data_size": 7936 00:24:49.894 } 00:24:49.894 ] 00:24:49.894 }' 00:24:49.894 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.894 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:49.894 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:24:50.153 "name": "raid_bdev1", 00:24:50.153 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:50.153 "strip_size_kb": 0, 00:24:50.153 "state": "online", 00:24:50.153 "raid_level": "raid1", 00:24:50.153 "superblock": true, 00:24:50.153 "num_base_bdevs": 2, 00:24:50.153 "num_base_bdevs_discovered": 2, 00:24:50.153 "num_base_bdevs_operational": 2, 00:24:50.153 "base_bdevs_list": [ 00:24:50.153 { 00:24:50.153 "name": "spare", 00:24:50.153 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:50.153 "is_configured": true, 00:24:50.153 "data_offset": 256, 00:24:50.153 "data_size": 7936 00:24:50.153 }, 00:24:50.153 { 00:24:50.153 "name": "BaseBdev2", 00:24:50.153 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:50.153 "is_configured": true, 00:24:50.153 "data_offset": 256, 00:24:50.153 "data_size": 7936 00:24:50.153 } 00:24:50.153 ] 00:24:50.153 }' 00:24:50.153 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.413 08:00:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.413 08:00:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.413 "name": "raid_bdev1", 00:24:50.413 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:50.413 "strip_size_kb": 0, 00:24:50.413 "state": "online", 00:24:50.413 "raid_level": "raid1", 00:24:50.413 "superblock": true, 00:24:50.413 "num_base_bdevs": 2, 00:24:50.413 "num_base_bdevs_discovered": 2, 00:24:50.413 "num_base_bdevs_operational": 2, 00:24:50.413 "base_bdevs_list": [ 00:24:50.413 { 00:24:50.413 "name": "spare", 00:24:50.413 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:50.413 "is_configured": true, 00:24:50.413 "data_offset": 256, 00:24:50.413 "data_size": 7936 00:24:50.413 }, 00:24:50.413 { 00:24:50.413 "name": 
"BaseBdev2", 00:24:50.413 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:50.413 "is_configured": true, 00:24:50.413 "data_offset": 256, 00:24:50.413 "data_size": 7936 00:24:50.413 } 00:24:50.413 ] 00:24:50.413 }' 00:24:50.413 08:00:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.413 08:00:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:50.981 08:00:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:51.240 [2024-07-15 08:00:35.831750] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:51.240 [2024-07-15 08:00:35.831769] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:51.240 [2024-07-15 08:00:35.831812] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:51.240 [2024-07-15 08:00:35.831849] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:51.240 [2024-07-15 08:00:35.831855] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10c1390 name raid_bdev1, state offline 00:24:51.240 08:00:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.240 08:00:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:51.499 /dev/nbd0 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@867 -- # local i 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:51.499 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:51.760 1+0 records in 00:24:51.760 1+0 records out 00:24:51.760 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000153514 s, 26.7 MB/s 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:51.760 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:51.761 /dev/nbd1 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:51.761 1+0 records in 00:24:51.761 1+0 records out 00:24:51.761 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286514 s, 14.3 MB/s 00:24:51.761 
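With the raid bdev deleted, the surviving base bdev and the rebuilt spare are exported over NBD and compared directly; the cmp that follows skips the first 1048576 bytes, which matches the 256-block data_offset at 4096-byte blocks reported in the raid JSON (the superblock/metadata region). A condensed sketch of that comparison, reusing the device and bdev names from this log and omitting the waitfornbd readiness probes visible above:

# export both members over NBD and compare the data region only
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc nbd_start_disk BaseBdev1 /dev/nbd0
$rpc nbd_start_disk spare /dev/nbd1
cmp -i 1048576 /dev/nbd0 /dev/nbd1   # skip the 1 MiB data_offset; exit 0 means the rebuilt data matches
$rpc nbd_stop_disk /dev/nbd0
$rpc nbd_stop_disk /dev/nbd1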
08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:51.761 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:52.020 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:52.280 08:00:36 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:52.280 08:00:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:52.540 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:52.799 [2024-07-15 08:00:37.305621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:52.799 [2024-07-15 08:00:37.305654] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.799 [2024-07-15 08:00:37.305666] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10bfd60 00:24:52.799 [2024-07-15 08:00:37.305672] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.799 [2024-07-15 08:00:37.307034] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.799 [2024-07-15 08:00:37.307055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:52.799 [2024-07-15 08:00:37.307110] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:52.799 [2024-07-15 08:00:37.307130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:52.799 [2024-07-15 08:00:37.307206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:52.799 spare 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.799 [2024-07-15 08:00:37.407494] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf1d920 00:24:52.799 [2024-07-15 
08:00:37.407503] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:52.799 [2024-07-15 08:00:37.407650] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1e920 00:24:52.799 [2024-07-15 08:00:37.407764] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf1d920 00:24:52.799 [2024-07-15 08:00:37.407770] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf1d920 00:24:52.799 [2024-07-15 08:00:37.407843] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:52.799 "name": "raid_bdev1", 00:24:52.799 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:52.799 "strip_size_kb": 0, 00:24:52.799 "state": "online", 00:24:52.799 "raid_level": "raid1", 00:24:52.799 "superblock": true, 00:24:52.799 "num_base_bdevs": 2, 00:24:52.799 "num_base_bdevs_discovered": 2, 00:24:52.799 "num_base_bdevs_operational": 2, 00:24:52.799 "base_bdevs_list": [ 00:24:52.799 { 00:24:52.799 "name": "spare", 00:24:52.799 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:52.799 "is_configured": true, 00:24:52.799 "data_offset": 256, 00:24:52.799 "data_size": 7936 00:24:52.799 }, 00:24:52.799 { 00:24:52.799 "name": "BaseBdev2", 00:24:52.799 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:52.799 "is_configured": true, 00:24:52.799 "data_offset": 256, 00:24:52.799 "data_size": 7936 00:24:52.799 } 00:24:52.799 ] 00:24:52.799 }' 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:52.799 08:00:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.368 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:53.627 "name": "raid_bdev1", 00:24:53.627 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:53.627 "strip_size_kb": 0, 00:24:53.627 "state": "online", 00:24:53.627 "raid_level": "raid1", 00:24:53.627 "superblock": true, 00:24:53.627 "num_base_bdevs": 2, 00:24:53.627 "num_base_bdevs_discovered": 2, 00:24:53.627 "num_base_bdevs_operational": 2, 00:24:53.627 "base_bdevs_list": [ 00:24:53.627 { 00:24:53.627 "name": "spare", 00:24:53.627 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:53.627 "is_configured": true, 00:24:53.627 "data_offset": 256, 00:24:53.627 "data_size": 7936 00:24:53.627 }, 00:24:53.627 { 00:24:53.627 "name": "BaseBdev2", 00:24:53.627 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:53.627 "is_configured": true, 00:24:53.627 
"data_offset": 256, 00:24:53.627 "data_size": 7936 00:24:53.627 } 00:24:53.627 ] 00:24:53.627 }' 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.627 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:53.886 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:53.886 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:54.147 [2024-07-15 08:00:38.717283] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.147 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.407 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.407 "name": "raid_bdev1", 00:24:54.407 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:54.407 "strip_size_kb": 0, 00:24:54.407 "state": "online", 00:24:54.407 "raid_level": "raid1", 00:24:54.407 "superblock": true, 00:24:54.407 "num_base_bdevs": 2, 00:24:54.407 "num_base_bdevs_discovered": 1, 00:24:54.407 "num_base_bdevs_operational": 1, 00:24:54.407 "base_bdevs_list": [ 00:24:54.407 { 00:24:54.407 "name": null, 00:24:54.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.407 "is_configured": false, 00:24:54.407 "data_offset": 256, 00:24:54.407 "data_size": 7936 00:24:54.407 }, 00:24:54.407 { 00:24:54.407 "name": "BaseBdev2", 00:24:54.407 "uuid": 
"e09b2543-b711-58af-b374-81793297ddf8", 00:24:54.407 "is_configured": true, 00:24:54.407 "data_offset": 256, 00:24:54.407 "data_size": 7936 00:24:54.407 } 00:24:54.407 ] 00:24:54.407 }' 00:24:54.407 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.407 08:00:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:54.977 08:00:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:54.977 [2024-07-15 08:00:39.631610] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:54.977 [2024-07-15 08:00:39.631719] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:54.977 [2024-07-15 08:00:39.631728] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:54.977 [2024-07-15 08:00:39.631745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:54.977 [2024-07-15 08:00:39.635070] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf27580 00:24:54.977 [2024-07-15 08:00:39.636697] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:54.977 08:00:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.917 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.177 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.177 "name": "raid_bdev1", 00:24:56.177 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:56.177 "strip_size_kb": 0, 00:24:56.177 "state": "online", 00:24:56.177 "raid_level": "raid1", 00:24:56.177 "superblock": true, 00:24:56.177 "num_base_bdevs": 2, 00:24:56.177 "num_base_bdevs_discovered": 2, 00:24:56.177 "num_base_bdevs_operational": 2, 00:24:56.177 "process": { 00:24:56.177 "type": "rebuild", 00:24:56.177 "target": "spare", 00:24:56.177 "progress": { 00:24:56.177 "blocks": 2816, 00:24:56.177 "percent": 35 00:24:56.177 } 00:24:56.177 }, 00:24:56.177 "base_bdevs_list": [ 00:24:56.177 { 00:24:56.177 "name": "spare", 00:24:56.177 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:56.177 "is_configured": true, 00:24:56.177 "data_offset": 256, 00:24:56.177 "data_size": 7936 00:24:56.177 }, 00:24:56.177 { 00:24:56.177 "name": "BaseBdev2", 00:24:56.177 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:56.177 "is_configured": true, 00:24:56.177 "data_offset": 256, 00:24:56.177 "data_size": 
7936 00:24:56.177 } 00:24:56.177 ] 00:24:56.177 }' 00:24:56.177 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.177 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:56.177 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.438 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.438 08:00:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:56.438 [2024-07-15 08:00:41.109124] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.438 [2024-07-15 08:00:41.145462] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:56.438 [2024-07-15 08:00:41.145492] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:56.438 [2024-07-15 08:00:41.145501] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.438 [2024-07-15 08:00:41.145506] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.438 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.698 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.698 "name": "raid_bdev1", 00:24:56.698 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:56.698 "strip_size_kb": 0, 00:24:56.698 "state": "online", 00:24:56.698 "raid_level": "raid1", 00:24:56.698 "superblock": true, 00:24:56.698 "num_base_bdevs": 2, 00:24:56.698 "num_base_bdevs_discovered": 1, 00:24:56.698 "num_base_bdevs_operational": 1, 00:24:56.698 "base_bdevs_list": [ 00:24:56.698 { 00:24:56.698 "name": null, 00:24:56.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.698 "is_configured": false, 00:24:56.698 "data_offset": 256, 00:24:56.698 "data_size": 7936 00:24:56.698 }, 00:24:56.698 { 00:24:56.698 
"name": "BaseBdev2", 00:24:56.698 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:56.698 "is_configured": true, 00:24:56.698 "data_offset": 256, 00:24:56.698 "data_size": 7936 00:24:56.698 } 00:24:56.698 ] 00:24:56.698 }' 00:24:56.698 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.698 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:57.268 08:00:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:57.528 [2024-07-15 08:00:42.075633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:57.528 [2024-07-15 08:00:42.075668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:57.528 [2024-07-15 08:00:42.075684] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1dca0 00:24:57.528 [2024-07-15 08:00:42.075691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:57.528 [2024-07-15 08:00:42.076002] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:57.528 [2024-07-15 08:00:42.076015] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:57.528 [2024-07-15 08:00:42.076073] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:57.528 [2024-07-15 08:00:42.076080] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:57.528 [2024-07-15 08:00:42.076086] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:57.528 [2024-07-15 08:00:42.076097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:57.528 [2024-07-15 08:00:42.079361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1e900 00:24:57.528 [2024-07-15 08:00:42.080519] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:57.528 spare 00:24:57.528 08:00:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.469 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.729 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.729 "name": "raid_bdev1", 00:24:58.729 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:58.729 "strip_size_kb": 0, 00:24:58.729 "state": "online", 00:24:58.729 "raid_level": "raid1", 00:24:58.729 "superblock": true, 00:24:58.729 "num_base_bdevs": 2, 00:24:58.729 "num_base_bdevs_discovered": 2, 00:24:58.729 "num_base_bdevs_operational": 2, 00:24:58.729 "process": { 00:24:58.729 "type": "rebuild", 00:24:58.729 "target": "spare", 00:24:58.729 "progress": { 00:24:58.729 "blocks": 2816, 00:24:58.729 "percent": 35 00:24:58.729 } 00:24:58.729 }, 00:24:58.729 "base_bdevs_list": [ 00:24:58.729 { 00:24:58.729 "name": "spare", 00:24:58.729 "uuid": "015c157f-f47c-5351-a1bb-5b775459b45c", 00:24:58.729 "is_configured": true, 00:24:58.729 "data_offset": 256, 00:24:58.729 "data_size": 7936 00:24:58.729 }, 00:24:58.729 { 00:24:58.729 "name": "BaseBdev2", 00:24:58.729 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:58.729 "is_configured": true, 00:24:58.729 "data_offset": 256, 00:24:58.729 "data_size": 7936 00:24:58.729 } 00:24:58.729 ] 00:24:58.729 }' 00:24:58.729 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.729 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:58.729 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.729 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:58.729 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:58.987 [2024-07-15 08:00:43.569023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:58.987 [2024-07-15 08:00:43.589315] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:58.987 [2024-07-15 08:00:43.589344] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.987 [2024-07-15 08:00:43.589353] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:58.987 [2024-07-15 08:00:43.589357] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:58.987 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:58.987 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:58.987 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:58.987 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.988 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.247 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.247 "name": "raid_bdev1", 00:24:59.247 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:59.247 "strip_size_kb": 0, 00:24:59.247 "state": "online", 00:24:59.247 "raid_level": "raid1", 00:24:59.247 "superblock": true, 00:24:59.247 "num_base_bdevs": 2, 00:24:59.247 "num_base_bdevs_discovered": 1, 00:24:59.247 "num_base_bdevs_operational": 1, 00:24:59.247 "base_bdevs_list": [ 00:24:59.247 { 00:24:59.247 "name": null, 00:24:59.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.247 "is_configured": false, 00:24:59.247 "data_offset": 256, 00:24:59.247 "data_size": 7936 00:24:59.247 }, 00:24:59.247 { 00:24:59.247 "name": "BaseBdev2", 00:24:59.247 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:59.247 "is_configured": true, 00:24:59.247 "data_offset": 256, 00:24:59.247 "data_size": 7936 00:24:59.247 } 00:24:59.247 ] 00:24:59.247 }' 00:24:59.247 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.247 08:00:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.815 08:00:44 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.815 "name": "raid_bdev1", 00:24:59.815 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:24:59.815 "strip_size_kb": 0, 00:24:59.815 "state": "online", 00:24:59.815 "raid_level": "raid1", 00:24:59.815 "superblock": true, 00:24:59.815 "num_base_bdevs": 2, 00:24:59.815 "num_base_bdevs_discovered": 1, 00:24:59.815 "num_base_bdevs_operational": 1, 00:24:59.815 "base_bdevs_list": [ 00:24:59.815 { 00:24:59.815 "name": null, 00:24:59.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.815 "is_configured": false, 00:24:59.815 "data_offset": 256, 00:24:59.815 "data_size": 7936 00:24:59.815 }, 00:24:59.815 { 00:24:59.815 "name": "BaseBdev2", 00:24:59.815 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:24:59.815 "is_configured": true, 00:24:59.815 "data_offset": 256, 00:24:59.815 "data_size": 7936 00:24:59.815 } 00:24:59.815 ] 00:24:59.815 }' 00:24:59.815 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.074 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:00.074 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.074 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:00.074 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:00.334 08:00:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:00.334 [2024-07-15 08:00:45.028674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:00.334 [2024-07-15 08:00:45.028706] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:00.334 [2024-07-15 08:00:45.028723] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf26f60 00:25:00.334 [2024-07-15 08:00:45.028729] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:00.334 [2024-07-15 08:00:45.029009] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:00.334 [2024-07-15 08:00:45.029020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:00.334 [2024-07-15 08:00:45.029066] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:00.334 [2024-07-15 08:00:45.029073] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:00.334 [2024-07-15 08:00:45.029079] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:00.334 BaseBdev1 00:25:00.334 08:00:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:01.714 "name": "raid_bdev1", 00:25:01.714 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:25:01.714 "strip_size_kb": 0, 00:25:01.714 "state": "online", 00:25:01.714 "raid_level": "raid1", 00:25:01.714 "superblock": true, 00:25:01.714 "num_base_bdevs": 2, 00:25:01.714 "num_base_bdevs_discovered": 1, 00:25:01.714 "num_base_bdevs_operational": 1, 00:25:01.714 "base_bdevs_list": [ 00:25:01.714 { 00:25:01.714 "name": null, 00:25:01.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.714 "is_configured": false, 00:25:01.714 "data_offset": 256, 00:25:01.714 "data_size": 7936 00:25:01.714 }, 00:25:01.714 { 00:25:01.714 "name": "BaseBdev2", 00:25:01.714 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:25:01.714 "is_configured": true, 00:25:01.714 "data_offset": 256, 00:25:01.714 "data_size": 7936 00:25:01.714 } 00:25:01.714 ] 00:25:01.714 }' 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:01.714 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:25:02.284 "name": "raid_bdev1", 00:25:02.284 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:25:02.284 "strip_size_kb": 0, 00:25:02.284 "state": "online", 00:25:02.284 "raid_level": "raid1", 00:25:02.284 "superblock": true, 00:25:02.284 "num_base_bdevs": 2, 00:25:02.284 "num_base_bdevs_discovered": 1, 00:25:02.284 "num_base_bdevs_operational": 1, 00:25:02.284 "base_bdevs_list": [ 00:25:02.284 { 00:25:02.284 "name": null, 00:25:02.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.284 "is_configured": false, 00:25:02.284 "data_offset": 256, 00:25:02.284 "data_size": 7936 00:25:02.284 }, 00:25:02.284 { 00:25:02.284 "name": "BaseBdev2", 00:25:02.284 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:25:02.284 "is_configured": true, 00:25:02.284 "data_offset": 256, 00:25:02.284 "data_size": 7936 00:25:02.284 } 00:25:02.284 ] 00:25:02.284 }' 00:25:02.284 08:00:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:02.544 [2024-07-15 08:00:47.266431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:02.544 [2024-07-15 08:00:47.266524] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:02.544 [2024-07-15 08:00:47.266533] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:02.544 request: 00:25:02.544 { 00:25:02.544 "base_bdev": "BaseBdev1", 00:25:02.544 "raid_bdev": "raid_bdev1", 00:25:02.544 "method": "bdev_raid_add_base_bdev", 00:25:02.544 "req_id": 1 00:25:02.544 } 00:25:02.544 Got JSON-RPC error response 00:25:02.544 response: 00:25:02.544 { 00:25:02.544 "code": -22, 00:25:02.544 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:02.544 } 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:02.544 08:00:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.933 "name": "raid_bdev1", 00:25:03.933 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:25:03.933 "strip_size_kb": 0, 00:25:03.933 "state": "online", 00:25:03.933 "raid_level": "raid1", 00:25:03.933 "superblock": true, 00:25:03.933 "num_base_bdevs": 2, 00:25:03.933 "num_base_bdevs_discovered": 1, 00:25:03.933 "num_base_bdevs_operational": 1, 00:25:03.933 "base_bdevs_list": [ 00:25:03.933 { 00:25:03.933 "name": null, 00:25:03.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.933 "is_configured": false, 00:25:03.933 "data_offset": 256, 00:25:03.933 "data_size": 7936 00:25:03.933 }, 00:25:03.933 { 00:25:03.933 "name": "BaseBdev2", 00:25:03.933 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:25:03.933 "is_configured": true, 00:25:03.933 "data_offset": 256, 00:25:03.933 "data_size": 7936 
00:25:03.933 } 00:25:03.933 ] 00:25:03.933 }' 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:03.933 08:00:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.504 "name": "raid_bdev1", 00:25:04.504 "uuid": "43a86a6f-9d92-4446-9bc8-e7951a625306", 00:25:04.504 "strip_size_kb": 0, 00:25:04.504 "state": "online", 00:25:04.504 "raid_level": "raid1", 00:25:04.504 "superblock": true, 00:25:04.504 "num_base_bdevs": 2, 00:25:04.504 "num_base_bdevs_discovered": 1, 00:25:04.504 "num_base_bdevs_operational": 1, 00:25:04.504 "base_bdevs_list": [ 00:25:04.504 { 00:25:04.504 "name": null, 00:25:04.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.504 "is_configured": false, 00:25:04.504 "data_offset": 256, 00:25:04.504 "data_size": 7936 00:25:04.504 }, 00:25:04.504 { 00:25:04.504 "name": "BaseBdev2", 00:25:04.504 "uuid": "e09b2543-b711-58af-b374-81793297ddf8", 00:25:04.504 "is_configured": true, 00:25:04.504 "data_offset": 256, 00:25:04.504 "data_size": 7936 00:25:04.504 } 00:25:04.504 ] 00:25:04.504 }' 00:25:04.504 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.764 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:04.764 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.764 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:04.764 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1746596 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1746596 ']' 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1746596 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1746596 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1746596' 00:25:04.765 killing process with pid 1746596 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1746596 00:25:04.765 Received shutdown signal, test time was about 60.000000 seconds 00:25:04.765 00:25:04.765 Latency(us) 00:25:04.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:04.765 =================================================================================================================== 00:25:04.765 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:04.765 [2024-07-15 08:00:49.376309] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:04.765 [2024-07-15 08:00:49.376376] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:04.765 [2024-07-15 08:00:49.376409] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:04.765 [2024-07-15 08:00:49.376415] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1d920 name raid_bdev1, state offline 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1746596 00:25:04.765 [2024-07-15 08:00:49.391384] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:25:04.765 00:25:04.765 real 0m27.410s 00:25:04.765 user 0m43.000s 00:25:04.765 sys 0m3.484s 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:04.765 08:00:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:04.765 ************************************ 00:25:04.765 END TEST raid_rebuild_test_sb_4k 00:25:04.765 ************************************ 00:25:05.025 08:00:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:05.025 08:00:49 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:25:05.025 08:00:49 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:25:05.025 08:00:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:05.025 08:00:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:05.025 08:00:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:05.025 ************************************ 00:25:05.025 START TEST raid_state_function_test_sb_md_separate 00:25:05.025 ************************************ 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:05.025 08:00:49 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1751566 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1751566' 00:25:05.025 Process raid pid: 1751566 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1751566 /var/tmp/spdk-raid.sock 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1751566 ']' 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:05.025 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
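For context on the prologue that follows, the state-function test drives a standalone bdev_svc application over its own RPC socket. Condensed from the trace, the startup amounts to roughly the sketch below; the polling loop is a simplified stand-in for the waitforlisten helper (whose real implementation is not shown in this log), and rpc_get_methods is used here only as a generic liveness probe.

  # Launch the minimal bdev service with raid debug logging on a dedicated RPC socket.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
      -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  # Simplified stand-in for waitforlisten: block until the socket answers RPCs.
  until /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
      -s /var/tmp/spdk-raid.sock rpc_get_methods &> /dev/null; do
    sleep 0.1
  done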
00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:05.025 08:00:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:05.025 [2024-07-15 08:00:49.656588] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:25:05.025 [2024-07-15 08:00:49.656637] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:05.025 [2024-07-15 08:00:49.745220] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:05.285 [2024-07-15 08:00:49.809128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:05.285 [2024-07-15 08:00:49.857037] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:05.285 [2024-07-15 08:00:49.857057] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:05.852 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:05.852 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:25:05.853 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:06.112 [2024-07-15 08:00:50.649150] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:06.112 [2024-07-15 08:00:50.649179] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:06.112 [2024-07-15 08:00:50.649185] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:06.112 [2024-07-15 08:00:50.649191] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:06.112 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.112 "name": "Existed_Raid", 00:25:06.112 "uuid": "459ae7b9-824d-4364-84df-4de9eb781848", 00:25:06.112 "strip_size_kb": 0, 00:25:06.112 "state": "configuring", 00:25:06.112 "raid_level": "raid1", 00:25:06.112 "superblock": true, 00:25:06.112 "num_base_bdevs": 2, 00:25:06.112 "num_base_bdevs_discovered": 0, 00:25:06.112 "num_base_bdevs_operational": 2, 00:25:06.112 "base_bdevs_list": [ 00:25:06.112 { 00:25:06.112 "name": "BaseBdev1", 00:25:06.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.112 "is_configured": false, 00:25:06.112 "data_offset": 0, 00:25:06.112 "data_size": 0 00:25:06.112 }, 00:25:06.112 { 00:25:06.112 "name": "BaseBdev2", 00:25:06.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.112 "is_configured": false, 00:25:06.112 "data_offset": 0, 00:25:06.112 "data_size": 0 00:25:06.113 } 00:25:06.113 ] 00:25:06.113 }' 00:25:06.113 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.113 08:00:50 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:06.682 08:00:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:07.276 [2024-07-15 08:00:51.900193] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:07.276 [2024-07-15 08:00:51.900211] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa246b0 name Existed_Raid, state configuring 00:25:07.276 08:00:51 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:07.847 [2024-07-15 08:00:52.441560] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:07.847 [2024-07-15 08:00:52.441578] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:07.847 [2024-07-15 08:00:52.441584] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:07.847 [2024-07-15 08:00:52.441589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:07.847 08:00:52 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:25:08.416 [2024-07-15 08:00:52.989949] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:08.416 BaseBdev1 00:25:08.416 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:08.416 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:25:08.416 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:08.416 
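The bdev_raid_create and bdev_malloc_create calls traced above boil down to the following sequence. This is a sketch assembled from the commands visible in this log; rpc and sock are the same illustrative shorthand as before, and the original script's helper plumbing is not reproduced here.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Declare the raid1 set first; the base bdevs need not exist yet ("doesn't exist now"
  # in the trace), so Existed_Raid is registered in the "configuring" state.
  $rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  # Create the first base bdev: a 32 MB malloc disk with 4096-byte blocks and 32 bytes of
  # separate metadata per block (-m 32); the bdev_get_bdevs dump later in the trace shows
  # the resulting num_blocks 8192, md_size 32, md_interleave false.
  $rpc -s $sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1
  # With only BaseBdev1 present, Existed_Raid stays "configuring" until BaseBdev2 is added.
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'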
08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:25:08.416 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:08.416 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:08.416 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:08.984 08:00:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:09.553 [ 00:25:09.553 { 00:25:09.553 "name": "BaseBdev1", 00:25:09.553 "aliases": [ 00:25:09.553 "7fb179fc-95bb-4ff3-b546-84d9ea93e046" 00:25:09.553 ], 00:25:09.553 "product_name": "Malloc disk", 00:25:09.553 "block_size": 4096, 00:25:09.553 "num_blocks": 8192, 00:25:09.553 "uuid": "7fb179fc-95bb-4ff3-b546-84d9ea93e046", 00:25:09.553 "md_size": 32, 00:25:09.553 "md_interleave": false, 00:25:09.553 "dif_type": 0, 00:25:09.553 "assigned_rate_limits": { 00:25:09.553 "rw_ios_per_sec": 0, 00:25:09.553 "rw_mbytes_per_sec": 0, 00:25:09.553 "r_mbytes_per_sec": 0, 00:25:09.553 "w_mbytes_per_sec": 0 00:25:09.553 }, 00:25:09.553 "claimed": true, 00:25:09.553 "claim_type": "exclusive_write", 00:25:09.553 "zoned": false, 00:25:09.553 "supported_io_types": { 00:25:09.553 "read": true, 00:25:09.553 "write": true, 00:25:09.553 "unmap": true, 00:25:09.553 "flush": true, 00:25:09.553 "reset": true, 00:25:09.553 "nvme_admin": false, 00:25:09.553 "nvme_io": false, 00:25:09.553 "nvme_io_md": false, 00:25:09.553 "write_zeroes": true, 00:25:09.553 "zcopy": true, 00:25:09.553 "get_zone_info": false, 00:25:09.553 "zone_management": false, 00:25:09.553 "zone_append": false, 00:25:09.553 "compare": false, 00:25:09.553 "compare_and_write": false, 00:25:09.553 "abort": true, 00:25:09.553 "seek_hole": false, 00:25:09.553 "seek_data": false, 00:25:09.553 "copy": true, 00:25:09.553 "nvme_iov_md": false 00:25:09.553 }, 00:25:09.553 "memory_domains": [ 00:25:09.553 { 00:25:09.553 "dma_device_id": "system", 00:25:09.553 "dma_device_type": 1 00:25:09.553 }, 00:25:09.553 { 00:25:09.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:09.553 "dma_device_type": 2 00:25:09.553 } 00:25:09.553 ], 00:25:09.553 "driver_specific": {} 00:25:09.553 } 00:25:09.553 ] 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:09.553 08:00:54 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:09.553 "name": "Existed_Raid", 00:25:09.553 "uuid": "363df13c-e870-45b6-9ec5-894e9bdafa6c", 00:25:09.553 "strip_size_kb": 0, 00:25:09.553 "state": "configuring", 00:25:09.553 "raid_level": "raid1", 00:25:09.553 "superblock": true, 00:25:09.553 "num_base_bdevs": 2, 00:25:09.553 "num_base_bdevs_discovered": 1, 00:25:09.553 "num_base_bdevs_operational": 2, 00:25:09.553 "base_bdevs_list": [ 00:25:09.553 { 00:25:09.553 "name": "BaseBdev1", 00:25:09.553 "uuid": "7fb179fc-95bb-4ff3-b546-84d9ea93e046", 00:25:09.553 "is_configured": true, 00:25:09.553 "data_offset": 256, 00:25:09.553 "data_size": 7936 00:25:09.553 }, 00:25:09.553 { 00:25:09.553 "name": "BaseBdev2", 00:25:09.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.553 "is_configured": false, 00:25:09.553 "data_offset": 0, 00:25:09.553 "data_size": 0 00:25:09.553 } 00:25:09.553 ] 00:25:09.553 }' 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:09.553 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:10.124 08:00:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:10.384 [2024-07-15 08:00:55.019101] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:10.384 [2024-07-15 08:00:55.019130] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa23fa0 name Existed_Raid, state configuring 00:25:10.384 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:10.644 [2024-07-15 08:00:55.203594] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:10.644 [2024-07-15 08:00:55.204687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:10.644 [2024-07-15 08:00:55.204714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.644 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:10.903 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.903 "name": "Existed_Raid", 00:25:10.903 "uuid": "ad54ef31-3b20-48a3-a67e-f012dd2ca10f", 00:25:10.903 "strip_size_kb": 0, 00:25:10.903 "state": "configuring", 00:25:10.903 "raid_level": "raid1", 00:25:10.903 "superblock": true, 00:25:10.903 "num_base_bdevs": 2, 00:25:10.903 "num_base_bdevs_discovered": 1, 00:25:10.903 "num_base_bdevs_operational": 2, 00:25:10.903 "base_bdevs_list": [ 00:25:10.903 { 00:25:10.903 "name": "BaseBdev1", 00:25:10.903 "uuid": "7fb179fc-95bb-4ff3-b546-84d9ea93e046", 00:25:10.903 "is_configured": true, 00:25:10.903 "data_offset": 256, 00:25:10.903 "data_size": 7936 00:25:10.903 }, 00:25:10.903 { 00:25:10.903 "name": "BaseBdev2", 00:25:10.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.903 "is_configured": false, 00:25:10.903 "data_offset": 0, 00:25:10.903 "data_size": 0 00:25:10.903 } 00:25:10.903 ] 00:25:10.903 }' 00:25:10.903 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.903 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:11.473 08:00:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:25:11.473 [2024-07-15 08:00:56.127323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:11.473 [2024-07-15 08:00:56.127426] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa25ed0 00:25:11.473 [2024-07-15 08:00:56.127433] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:11.473 [2024-07-15 08:00:56.127475] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa24540 00:25:11.473 [2024-07-15 
08:00:56.127554] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa25ed0 00:25:11.473 [2024-07-15 08:00:56.127560] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa25ed0 00:25:11.473 [2024-07-15 08:00:56.127607] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:11.473 BaseBdev2 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:11.473 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:11.733 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:11.992 [ 00:25:11.992 { 00:25:11.992 "name": "BaseBdev2", 00:25:11.992 "aliases": [ 00:25:11.992 "08dc4f7b-3421-44f7-be86-a6d4061f5881" 00:25:11.992 ], 00:25:11.992 "product_name": "Malloc disk", 00:25:11.992 "block_size": 4096, 00:25:11.992 "num_blocks": 8192, 00:25:11.992 "uuid": "08dc4f7b-3421-44f7-be86-a6d4061f5881", 00:25:11.992 "md_size": 32, 00:25:11.992 "md_interleave": false, 00:25:11.992 "dif_type": 0, 00:25:11.992 "assigned_rate_limits": { 00:25:11.992 "rw_ios_per_sec": 0, 00:25:11.992 "rw_mbytes_per_sec": 0, 00:25:11.992 "r_mbytes_per_sec": 0, 00:25:11.992 "w_mbytes_per_sec": 0 00:25:11.992 }, 00:25:11.992 "claimed": true, 00:25:11.992 "claim_type": "exclusive_write", 00:25:11.992 "zoned": false, 00:25:11.992 "supported_io_types": { 00:25:11.992 "read": true, 00:25:11.993 "write": true, 00:25:11.993 "unmap": true, 00:25:11.993 "flush": true, 00:25:11.993 "reset": true, 00:25:11.993 "nvme_admin": false, 00:25:11.993 "nvme_io": false, 00:25:11.993 "nvme_io_md": false, 00:25:11.993 "write_zeroes": true, 00:25:11.993 "zcopy": true, 00:25:11.993 "get_zone_info": false, 00:25:11.993 "zone_management": false, 00:25:11.993 "zone_append": false, 00:25:11.993 "compare": false, 00:25:11.993 "compare_and_write": false, 00:25:11.993 "abort": true, 00:25:11.993 "seek_hole": false, 00:25:11.993 "seek_data": false, 00:25:11.993 "copy": true, 00:25:11.993 "nvme_iov_md": false 00:25:11.993 }, 00:25:11.993 "memory_domains": [ 00:25:11.993 { 00:25:11.993 "dma_device_id": "system", 00:25:11.993 "dma_device_type": 1 00:25:11.993 }, 00:25:11.993 { 00:25:11.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:11.993 "dma_device_type": 2 00:25:11.993 } 00:25:11.993 ], 00:25:11.993 "driver_specific": {} 00:25:11.993 } 00:25:11.993 ] 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:11.993 "name": "Existed_Raid", 00:25:11.993 "uuid": "ad54ef31-3b20-48a3-a67e-f012dd2ca10f", 00:25:11.993 "strip_size_kb": 0, 00:25:11.993 "state": "online", 00:25:11.993 "raid_level": "raid1", 00:25:11.993 "superblock": true, 00:25:11.993 "num_base_bdevs": 2, 00:25:11.993 "num_base_bdevs_discovered": 2, 00:25:11.993 "num_base_bdevs_operational": 2, 00:25:11.993 "base_bdevs_list": [ 00:25:11.993 { 00:25:11.993 "name": "BaseBdev1", 00:25:11.993 "uuid": "7fb179fc-95bb-4ff3-b546-84d9ea93e046", 00:25:11.993 "is_configured": true, 00:25:11.993 "data_offset": 256, 00:25:11.993 "data_size": 7936 00:25:11.993 }, 00:25:11.993 { 00:25:11.993 "name": "BaseBdev2", 00:25:11.993 "uuid": "08dc4f7b-3421-44f7-be86-a6d4061f5881", 00:25:11.993 "is_configured": true, 00:25:11.993 "data_offset": 256, 00:25:11.993 "data_size": 7936 00:25:11.993 } 00:25:11.993 ] 00:25:11.993 }' 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:11.993 08:00:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # 
local base_bdev_info 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:12.562 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:12.822 [2024-07-15 08:00:57.422842] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:12.822 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:12.822 "name": "Existed_Raid", 00:25:12.822 "aliases": [ 00:25:12.822 "ad54ef31-3b20-48a3-a67e-f012dd2ca10f" 00:25:12.822 ], 00:25:12.822 "product_name": "Raid Volume", 00:25:12.822 "block_size": 4096, 00:25:12.822 "num_blocks": 7936, 00:25:12.822 "uuid": "ad54ef31-3b20-48a3-a67e-f012dd2ca10f", 00:25:12.822 "md_size": 32, 00:25:12.822 "md_interleave": false, 00:25:12.822 "dif_type": 0, 00:25:12.822 "assigned_rate_limits": { 00:25:12.822 "rw_ios_per_sec": 0, 00:25:12.822 "rw_mbytes_per_sec": 0, 00:25:12.822 "r_mbytes_per_sec": 0, 00:25:12.822 "w_mbytes_per_sec": 0 00:25:12.822 }, 00:25:12.822 "claimed": false, 00:25:12.822 "zoned": false, 00:25:12.822 "supported_io_types": { 00:25:12.822 "read": true, 00:25:12.822 "write": true, 00:25:12.822 "unmap": false, 00:25:12.822 "flush": false, 00:25:12.822 "reset": true, 00:25:12.822 "nvme_admin": false, 00:25:12.822 "nvme_io": false, 00:25:12.822 "nvme_io_md": false, 00:25:12.822 "write_zeroes": true, 00:25:12.822 "zcopy": false, 00:25:12.822 "get_zone_info": false, 00:25:12.822 "zone_management": false, 00:25:12.822 "zone_append": false, 00:25:12.822 "compare": false, 00:25:12.822 "compare_and_write": false, 00:25:12.822 "abort": false, 00:25:12.822 "seek_hole": false, 00:25:12.822 "seek_data": false, 00:25:12.822 "copy": false, 00:25:12.822 "nvme_iov_md": false 00:25:12.822 }, 00:25:12.822 "memory_domains": [ 00:25:12.822 { 00:25:12.822 "dma_device_id": "system", 00:25:12.822 "dma_device_type": 1 00:25:12.822 }, 00:25:12.822 { 00:25:12.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.822 "dma_device_type": 2 00:25:12.822 }, 00:25:12.822 { 00:25:12.822 "dma_device_id": "system", 00:25:12.822 "dma_device_type": 1 00:25:12.822 }, 00:25:12.822 { 00:25:12.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.822 "dma_device_type": 2 00:25:12.822 } 00:25:12.822 ], 00:25:12.822 "driver_specific": { 00:25:12.822 "raid": { 00:25:12.822 "uuid": "ad54ef31-3b20-48a3-a67e-f012dd2ca10f", 00:25:12.822 "strip_size_kb": 0, 00:25:12.822 "state": "online", 00:25:12.822 "raid_level": "raid1", 00:25:12.822 "superblock": true, 00:25:12.822 "num_base_bdevs": 2, 00:25:12.822 "num_base_bdevs_discovered": 2, 00:25:12.822 "num_base_bdevs_operational": 2, 00:25:12.822 "base_bdevs_list": [ 00:25:12.822 { 00:25:12.822 "name": "BaseBdev1", 00:25:12.822 "uuid": "7fb179fc-95bb-4ff3-b546-84d9ea93e046", 00:25:12.822 "is_configured": true, 00:25:12.822 "data_offset": 256, 00:25:12.822 "data_size": 7936 00:25:12.822 }, 00:25:12.822 { 00:25:12.822 "name": "BaseBdev2", 00:25:12.822 "uuid": "08dc4f7b-3421-44f7-be86-a6d4061f5881", 00:25:12.822 "is_configured": true, 00:25:12.822 "data_offset": 256, 00:25:12.822 "data_size": 7936 00:25:12.822 } 00:25:12.822 
] 00:25:12.822 } 00:25:12.822 } 00:25:12.822 }' 00:25:12.822 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:12.822 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:12.822 BaseBdev2' 00:25:12.823 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:12.823 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:12.823 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:13.082 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:13.082 "name": "BaseBdev1", 00:25:13.082 "aliases": [ 00:25:13.082 "7fb179fc-95bb-4ff3-b546-84d9ea93e046" 00:25:13.082 ], 00:25:13.082 "product_name": "Malloc disk", 00:25:13.082 "block_size": 4096, 00:25:13.082 "num_blocks": 8192, 00:25:13.082 "uuid": "7fb179fc-95bb-4ff3-b546-84d9ea93e046", 00:25:13.082 "md_size": 32, 00:25:13.082 "md_interleave": false, 00:25:13.083 "dif_type": 0, 00:25:13.083 "assigned_rate_limits": { 00:25:13.083 "rw_ios_per_sec": 0, 00:25:13.083 "rw_mbytes_per_sec": 0, 00:25:13.083 "r_mbytes_per_sec": 0, 00:25:13.083 "w_mbytes_per_sec": 0 00:25:13.083 }, 00:25:13.083 "claimed": true, 00:25:13.083 "claim_type": "exclusive_write", 00:25:13.083 "zoned": false, 00:25:13.083 "supported_io_types": { 00:25:13.083 "read": true, 00:25:13.083 "write": true, 00:25:13.083 "unmap": true, 00:25:13.083 "flush": true, 00:25:13.083 "reset": true, 00:25:13.083 "nvme_admin": false, 00:25:13.083 "nvme_io": false, 00:25:13.083 "nvme_io_md": false, 00:25:13.083 "write_zeroes": true, 00:25:13.083 "zcopy": true, 00:25:13.083 "get_zone_info": false, 00:25:13.083 "zone_management": false, 00:25:13.083 "zone_append": false, 00:25:13.083 "compare": false, 00:25:13.083 "compare_and_write": false, 00:25:13.083 "abort": true, 00:25:13.083 "seek_hole": false, 00:25:13.083 "seek_data": false, 00:25:13.083 "copy": true, 00:25:13.083 "nvme_iov_md": false 00:25:13.083 }, 00:25:13.083 "memory_domains": [ 00:25:13.083 { 00:25:13.083 "dma_device_id": "system", 00:25:13.083 "dma_device_type": 1 00:25:13.083 }, 00:25:13.083 { 00:25:13.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:13.083 "dma_device_type": 2 00:25:13.083 } 00:25:13.083 ], 00:25:13.083 "driver_specific": {} 00:25:13.083 }' 00:25:13.083 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:13.083 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:13.083 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:13.083 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:13.083 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:13.342 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:13.342 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:13.342 08:00:57 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:13.342 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:13.342 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:13.342 08:00:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:13.342 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:13.342 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:13.342 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:13.342 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:13.601 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:13.601 "name": "BaseBdev2", 00:25:13.601 "aliases": [ 00:25:13.601 "08dc4f7b-3421-44f7-be86-a6d4061f5881" 00:25:13.601 ], 00:25:13.601 "product_name": "Malloc disk", 00:25:13.601 "block_size": 4096, 00:25:13.601 "num_blocks": 8192, 00:25:13.601 "uuid": "08dc4f7b-3421-44f7-be86-a6d4061f5881", 00:25:13.601 "md_size": 32, 00:25:13.601 "md_interleave": false, 00:25:13.601 "dif_type": 0, 00:25:13.601 "assigned_rate_limits": { 00:25:13.601 "rw_ios_per_sec": 0, 00:25:13.601 "rw_mbytes_per_sec": 0, 00:25:13.601 "r_mbytes_per_sec": 0, 00:25:13.601 "w_mbytes_per_sec": 0 00:25:13.601 }, 00:25:13.601 "claimed": true, 00:25:13.601 "claim_type": "exclusive_write", 00:25:13.601 "zoned": false, 00:25:13.601 "supported_io_types": { 00:25:13.601 "read": true, 00:25:13.601 "write": true, 00:25:13.601 "unmap": true, 00:25:13.601 "flush": true, 00:25:13.601 "reset": true, 00:25:13.601 "nvme_admin": false, 00:25:13.601 "nvme_io": false, 00:25:13.601 "nvme_io_md": false, 00:25:13.601 "write_zeroes": true, 00:25:13.601 "zcopy": true, 00:25:13.601 "get_zone_info": false, 00:25:13.601 "zone_management": false, 00:25:13.601 "zone_append": false, 00:25:13.601 "compare": false, 00:25:13.601 "compare_and_write": false, 00:25:13.601 "abort": true, 00:25:13.601 "seek_hole": false, 00:25:13.601 "seek_data": false, 00:25:13.601 "copy": true, 00:25:13.601 "nvme_iov_md": false 00:25:13.601 }, 00:25:13.601 "memory_domains": [ 00:25:13.601 { 00:25:13.601 "dma_device_id": "system", 00:25:13.601 "dma_device_type": 1 00:25:13.601 }, 00:25:13.601 { 00:25:13.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:13.601 "dma_device_type": 2 00:25:13.601 } 00:25:13.601 ], 00:25:13.601 "driver_specific": {} 00:25:13.601 }' 00:25:13.601 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:13.601 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:13.601 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:13.601 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:13.601 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 
]] 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:13.860 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:14.120 [2024-07-15 08:00:58.737983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.120 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:14.379 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.379 "name": "Existed_Raid", 00:25:14.379 "uuid": "ad54ef31-3b20-48a3-a67e-f012dd2ca10f", 00:25:14.379 "strip_size_kb": 0, 
00:25:14.379 "state": "online", 00:25:14.379 "raid_level": "raid1", 00:25:14.379 "superblock": true, 00:25:14.379 "num_base_bdevs": 2, 00:25:14.379 "num_base_bdevs_discovered": 1, 00:25:14.379 "num_base_bdevs_operational": 1, 00:25:14.379 "base_bdevs_list": [ 00:25:14.379 { 00:25:14.379 "name": null, 00:25:14.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.379 "is_configured": false, 00:25:14.379 "data_offset": 256, 00:25:14.379 "data_size": 7936 00:25:14.379 }, 00:25:14.379 { 00:25:14.379 "name": "BaseBdev2", 00:25:14.379 "uuid": "08dc4f7b-3421-44f7-be86-a6d4061f5881", 00:25:14.379 "is_configured": true, 00:25:14.379 "data_offset": 256, 00:25:14.379 "data_size": 7936 00:25:14.379 } 00:25:14.379 ] 00:25:14.379 }' 00:25:14.379 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.379 08:00:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:14.947 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:14.947 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:14.948 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.948 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:14.948 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:14.948 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:14.948 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:15.207 [2024-07-15 08:00:59.854678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:15.207 [2024-07-15 08:00:59.854741] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:15.207 [2024-07-15 08:00:59.861182] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:15.207 [2024-07-15 08:00:59.861207] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:15.207 [2024-07-15 08:00:59.861213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa25ed0 name Existed_Raid, state offline 00:25:15.207 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:15.207 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:15.207 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.207 08:00:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:15.465 08:01:00 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1751566 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1751566 ']' 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1751566 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1751566 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1751566' 00:25:15.465 killing process with pid 1751566 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1751566 00:25:15.465 [2024-07-15 08:01:00.117659] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:15.465 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1751566 00:25:15.465 [2024-07-15 08:01:00.118252] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:15.724 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:25:15.724 00:25:15.724 real 0m10.641s 00:25:15.724 user 0m19.498s 00:25:15.724 sys 0m1.509s 00:25:15.724 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:15.724 08:01:00 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:15.724 ************************************ 00:25:15.724 END TEST raid_state_function_test_sb_md_separate 00:25:15.724 ************************************ 00:25:15.724 08:01:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:15.724 08:01:00 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:25:15.724 08:01:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:25:15.724 08:01:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:15.724 08:01:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:15.724 ************************************ 00:25:15.724 START TEST raid_superblock_test_md_separate 00:25:15.724 ************************************ 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:15.724 08:01:00 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=1753609 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1753609 /var/tmp/spdk-raid.sock 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1753609 ']' 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:15.724 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:15.724 08:01:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:15.724 [2024-07-15 08:01:00.381434] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
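For readers following the trace, the superblock test above first brings up its own SPDK app on a private RPC socket before issuing any bdev RPCs. A minimal sketch of that bring-up, using the binary and socket paths shown in the trace at bdev_raid.sh@410-412; the readiness loop is a simplified stand-in for the waitforlisten helper in common/autotest_common.sh, not its actual implementation:

    rpc_sock=/var/tmp/spdk-raid.sock
    # Standalone bdev service with raid debug logging, as launched in the trace above.
    ./test/app/bdev_svc/bdev_svc -r "$rpc_sock" -L bdev_raid &
    raid_pid=$!
    # Poll the socket until the app answers; only then start sending bdev RPCs.
    until ./scripts/rpc.py -s "$rpc_sock" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.1
    done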
00:25:15.724 [2024-07-15 08:01:00.381483] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1753609 ] 00:25:15.724 [2024-07-15 08:01:00.469671] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:15.983 [2024-07-15 08:01:00.534038] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:15.983 [2024-07-15 08:01:00.585065] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:15.983 [2024-07-15 08:01:00.585090] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:16.551 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:25:16.809 malloc1 00:25:16.809 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:16.809 [2024-07-15 08:01:01.556449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:16.810 [2024-07-15 08:01:01.556481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.810 [2024-07-15 08:01:01.556492] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe9a3c0 00:25:16.810 [2024-07-15 08:01:01.556499] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.810 [2024-07-15 08:01:01.557668] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.810 [2024-07-15 08:01:01.557687] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:16.810 pt1 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:25:17.068 malloc2 00:25:17.068 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:17.327 [2024-07-15 08:01:01.935808] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:17.327 [2024-07-15 08:01:01.935835] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:17.327 [2024-07-15 08:01:01.935844] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1026af0 00:25:17.327 [2024-07-15 08:01:01.935850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:17.327 [2024-07-15 08:01:01.937017] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:17.327 [2024-07-15 08:01:01.937034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:17.327 pt2 00:25:17.327 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:17.327 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:17.327 08:01:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:17.587 [2024-07-15 08:01:02.128309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:17.587 [2024-07-15 08:01:02.129316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:17.587 [2024-07-15 08:01:02.129428] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101aee0 00:25:17.587 [2024-07-15 08:01:02.129436] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:17.587 [2024-07-15 08:01:02.129483] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101ade0 00:25:17.587 [2024-07-15 08:01:02.129570] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101aee0 00:25:17.587 [2024-07-15 08:01:02.129575] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101aee0 00:25:17.587 [2024-07-15 08:01:02.129624] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:17.587 08:01:02 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.587 "name": "raid_bdev1", 00:25:17.587 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:17.587 "strip_size_kb": 0, 00:25:17.587 "state": "online", 00:25:17.587 "raid_level": "raid1", 00:25:17.587 "superblock": true, 00:25:17.587 "num_base_bdevs": 2, 00:25:17.587 "num_base_bdevs_discovered": 2, 00:25:17.587 "num_base_bdevs_operational": 2, 00:25:17.587 "base_bdevs_list": [ 00:25:17.587 { 00:25:17.587 "name": "pt1", 00:25:17.587 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:17.587 "is_configured": true, 00:25:17.587 "data_offset": 256, 00:25:17.587 "data_size": 7936 00:25:17.587 }, 00:25:17.587 { 00:25:17.587 "name": "pt2", 00:25:17.587 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:17.587 "is_configured": true, 00:25:17.587 "data_offset": 256, 00:25:17.587 "data_size": 7936 00:25:17.587 } 00:25:17.587 ] 00:25:17.587 }' 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.587 08:01:02 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:25:18.157 08:01:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:18.157 08:01:02 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:18.418 [2024-07-15 08:01:03.046858] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:18.418 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:18.418 "name": "raid_bdev1", 00:25:18.418 "aliases": [ 00:25:18.418 "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0" 00:25:18.418 ], 00:25:18.418 "product_name": "Raid Volume", 00:25:18.418 "block_size": 4096, 00:25:18.418 "num_blocks": 7936, 00:25:18.418 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:18.418 "md_size": 32, 00:25:18.418 "md_interleave": false, 00:25:18.418 "dif_type": 0, 00:25:18.418 "assigned_rate_limits": { 00:25:18.418 "rw_ios_per_sec": 0, 00:25:18.418 "rw_mbytes_per_sec": 0, 00:25:18.418 "r_mbytes_per_sec": 0, 00:25:18.418 "w_mbytes_per_sec": 0 00:25:18.418 }, 00:25:18.418 "claimed": false, 00:25:18.418 "zoned": false, 00:25:18.418 "supported_io_types": { 00:25:18.418 "read": true, 00:25:18.418 "write": true, 00:25:18.418 "unmap": false, 00:25:18.418 "flush": false, 00:25:18.418 "reset": true, 00:25:18.418 "nvme_admin": false, 00:25:18.418 "nvme_io": false, 00:25:18.418 "nvme_io_md": false, 00:25:18.418 "write_zeroes": true, 00:25:18.418 "zcopy": false, 00:25:18.418 "get_zone_info": false, 00:25:18.418 "zone_management": false, 00:25:18.418 "zone_append": false, 00:25:18.418 "compare": false, 00:25:18.418 "compare_and_write": false, 00:25:18.418 "abort": false, 00:25:18.418 "seek_hole": false, 00:25:18.418 "seek_data": false, 00:25:18.418 "copy": false, 00:25:18.418 "nvme_iov_md": false 00:25:18.418 }, 00:25:18.418 "memory_domains": [ 00:25:18.418 { 00:25:18.418 "dma_device_id": "system", 00:25:18.418 "dma_device_type": 1 00:25:18.418 }, 00:25:18.418 { 00:25:18.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:18.418 "dma_device_type": 2 00:25:18.418 }, 00:25:18.418 { 00:25:18.418 "dma_device_id": "system", 00:25:18.418 "dma_device_type": 1 00:25:18.418 }, 00:25:18.418 { 00:25:18.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:18.418 "dma_device_type": 2 00:25:18.418 } 00:25:18.418 ], 00:25:18.418 "driver_specific": { 00:25:18.418 "raid": { 00:25:18.418 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:18.418 "strip_size_kb": 0, 00:25:18.418 "state": "online", 00:25:18.418 "raid_level": "raid1", 00:25:18.418 "superblock": true, 00:25:18.418 "num_base_bdevs": 2, 00:25:18.418 "num_base_bdevs_discovered": 2, 00:25:18.418 "num_base_bdevs_operational": 2, 00:25:18.418 "base_bdevs_list": [ 00:25:18.418 { 00:25:18.418 "name": "pt1", 00:25:18.418 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:18.418 "is_configured": true, 00:25:18.418 "data_offset": 256, 00:25:18.418 "data_size": 7936 00:25:18.418 }, 00:25:18.418 { 00:25:18.418 "name": "pt2", 00:25:18.418 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:18.418 "is_configured": true, 00:25:18.418 "data_offset": 256, 00:25:18.418 "data_size": 7936 00:25:18.418 } 00:25:18.418 ] 00:25:18.418 } 00:25:18.418 } 00:25:18.418 }' 00:25:18.418 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:18.418 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:18.418 pt2' 00:25:18.418 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:18.418 08:01:03 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:18.418 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:18.678 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:18.678 "name": "pt1", 00:25:18.678 "aliases": [ 00:25:18.678 "00000000-0000-0000-0000-000000000001" 00:25:18.678 ], 00:25:18.678 "product_name": "passthru", 00:25:18.678 "block_size": 4096, 00:25:18.678 "num_blocks": 8192, 00:25:18.678 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:18.678 "md_size": 32, 00:25:18.678 "md_interleave": false, 00:25:18.678 "dif_type": 0, 00:25:18.678 "assigned_rate_limits": { 00:25:18.678 "rw_ios_per_sec": 0, 00:25:18.678 "rw_mbytes_per_sec": 0, 00:25:18.678 "r_mbytes_per_sec": 0, 00:25:18.678 "w_mbytes_per_sec": 0 00:25:18.678 }, 00:25:18.678 "claimed": true, 00:25:18.678 "claim_type": "exclusive_write", 00:25:18.678 "zoned": false, 00:25:18.678 "supported_io_types": { 00:25:18.678 "read": true, 00:25:18.678 "write": true, 00:25:18.678 "unmap": true, 00:25:18.678 "flush": true, 00:25:18.678 "reset": true, 00:25:18.678 "nvme_admin": false, 00:25:18.678 "nvme_io": false, 00:25:18.678 "nvme_io_md": false, 00:25:18.678 "write_zeroes": true, 00:25:18.678 "zcopy": true, 00:25:18.678 "get_zone_info": false, 00:25:18.678 "zone_management": false, 00:25:18.678 "zone_append": false, 00:25:18.678 "compare": false, 00:25:18.678 "compare_and_write": false, 00:25:18.678 "abort": true, 00:25:18.678 "seek_hole": false, 00:25:18.678 "seek_data": false, 00:25:18.678 "copy": true, 00:25:18.678 "nvme_iov_md": false 00:25:18.678 }, 00:25:18.678 "memory_domains": [ 00:25:18.678 { 00:25:18.678 "dma_device_id": "system", 00:25:18.678 "dma_device_type": 1 00:25:18.678 }, 00:25:18.678 { 00:25:18.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:18.678 "dma_device_type": 2 00:25:18.678 } 00:25:18.678 ], 00:25:18.678 "driver_specific": { 00:25:18.678 "passthru": { 00:25:18.678 "name": "pt1", 00:25:18.678 "base_bdev_name": "malloc1" 00:25:18.678 } 00:25:18.678 } 00:25:18.678 }' 00:25:18.678 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.678 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:18.678 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:18.678 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.678 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 
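The checks traced above are the md-separate layout verification applied to pt1: the test pulls the bdev's JSON with bdev_get_bdevs and compares block_size, md_size, md_interleave and dif_type against the expected 4096 / 32 / false / 0. A condensed sketch of that pattern, with a hypothetical helper name (check_md_layout) standing in for the inline checks in verify_raid_bdev_properties:

    # Hypothetical helper; the expected values (4096, 32, false, 0) come from the trace above.
    check_md_layout() {
        local bdev=$1 info
        info=$(./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b "$bdev" | jq '.[]')
        [[ $(jq .block_size <<< "$info") == 4096 ]]
        [[ $(jq .md_size <<< "$info") == 32 ]]
        [[ $(jq .md_interleave <<< "$info") == false ]]
        [[ $(jq .dif_type <<< "$info") == 0 ]]
    }
    check_md_layout pt1    # pt2 is checked the same way in the trace that follows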
00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:18.937 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:19.197 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:19.197 "name": "pt2", 00:25:19.197 "aliases": [ 00:25:19.197 "00000000-0000-0000-0000-000000000002" 00:25:19.197 ], 00:25:19.197 "product_name": "passthru", 00:25:19.197 "block_size": 4096, 00:25:19.197 "num_blocks": 8192, 00:25:19.197 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:19.197 "md_size": 32, 00:25:19.197 "md_interleave": false, 00:25:19.197 "dif_type": 0, 00:25:19.197 "assigned_rate_limits": { 00:25:19.197 "rw_ios_per_sec": 0, 00:25:19.197 "rw_mbytes_per_sec": 0, 00:25:19.197 "r_mbytes_per_sec": 0, 00:25:19.197 "w_mbytes_per_sec": 0 00:25:19.197 }, 00:25:19.197 "claimed": true, 00:25:19.197 "claim_type": "exclusive_write", 00:25:19.197 "zoned": false, 00:25:19.197 "supported_io_types": { 00:25:19.197 "read": true, 00:25:19.197 "write": true, 00:25:19.197 "unmap": true, 00:25:19.197 "flush": true, 00:25:19.197 "reset": true, 00:25:19.197 "nvme_admin": false, 00:25:19.197 "nvme_io": false, 00:25:19.197 "nvme_io_md": false, 00:25:19.197 "write_zeroes": true, 00:25:19.197 "zcopy": true, 00:25:19.197 "get_zone_info": false, 00:25:19.197 "zone_management": false, 00:25:19.197 "zone_append": false, 00:25:19.197 "compare": false, 00:25:19.197 "compare_and_write": false, 00:25:19.197 "abort": true, 00:25:19.197 "seek_hole": false, 00:25:19.197 "seek_data": false, 00:25:19.197 "copy": true, 00:25:19.197 "nvme_iov_md": false 00:25:19.197 }, 00:25:19.197 "memory_domains": [ 00:25:19.197 { 00:25:19.197 "dma_device_id": "system", 00:25:19.197 "dma_device_type": 1 00:25:19.197 }, 00:25:19.197 { 00:25:19.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:19.197 "dma_device_type": 2 00:25:19.197 } 00:25:19.197 ], 00:25:19.197 "driver_specific": { 00:25:19.197 "passthru": { 00:25:19.197 "name": "pt2", 00:25:19.197 "base_bdev_name": "malloc2" 00:25:19.197 } 00:25:19.197 } 00:25:19.197 }' 00:25:19.197 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:19.197 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:19.197 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:19.197 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:19.456 08:01:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:19.456 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:19.716 [2024-07-15 08:01:04.382228] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:19.716 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0 00:25:19.716 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0 ']' 00:25:19.716 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:19.976 [2024-07-15 08:01:04.578510] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:19.976 [2024-07-15 08:01:04.578521] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:19.976 [2024-07-15 08:01:04.578555] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:19.976 [2024-07-15 08:01:04.578592] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:19.976 [2024-07-15 08:01:04.578598] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101aee0 name raid_bdev1, state offline 00:25:19.976 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.976 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:20.235 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:20.235 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:20.235 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:20.235 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:20.235 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:20.235 08:01:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:20.496 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:20.496 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:20.755 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:21.015 [2024-07-15 08:01:05.532887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:21.015 [2024-07-15 08:01:05.533940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:21.015 [2024-07-15 08:01:05.533981] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:25:21.015 [2024-07-15 08:01:05.534006] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:21.015 [2024-07-15 08:01:05.534016] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:21.015 [2024-07-15 08:01:05.534022] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe99d60 name raid_bdev1, state configuring 00:25:21.015 request: 00:25:21.015 { 00:25:21.015 "name": "raid_bdev1", 00:25:21.015 "raid_level": "raid1", 00:25:21.015 "base_bdevs": [ 00:25:21.015 "malloc1", 00:25:21.015 "malloc2" 00:25:21.015 ], 00:25:21.015 "superblock": false, 00:25:21.015 "method": "bdev_raid_create", 00:25:21.015 "req_id": 1 00:25:21.015 } 00:25:21.015 Got JSON-RPC error response 00:25:21.015 response: 00:25:21.015 { 00:25:21.015 "code": -17, 00:25:21.015 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:21.015 } 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:25:21.015 08:01:05 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:21.015 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:21.275 [2024-07-15 08:01:05.901780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:21.275 [2024-07-15 08:01:05.901800] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.275 [2024-07-15 08:01:05.901810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101d150 00:25:21.275 [2024-07-15 08:01:05.901816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.275 [2024-07-15 08:01:05.902915] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.275 [2024-07-15 08:01:05.902931] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:21.275 [2024-07-15 08:01:05.902959] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:21.275 [2024-07-15 08:01:05.902976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:21.275 pt1 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:21.275 08:01:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.534 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:21.534 "name": "raid_bdev1", 00:25:21.534 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:21.534 "strip_size_kb": 0, 00:25:21.534 "state": "configuring", 00:25:21.534 "raid_level": "raid1", 00:25:21.534 "superblock": true, 00:25:21.534 "num_base_bdevs": 2, 00:25:21.534 "num_base_bdevs_discovered": 1, 00:25:21.534 "num_base_bdevs_operational": 2, 00:25:21.534 "base_bdevs_list": [ 00:25:21.534 { 00:25:21.534 "name": "pt1", 00:25:21.534 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:21.534 "is_configured": true, 00:25:21.534 "data_offset": 256, 00:25:21.534 "data_size": 7936 00:25:21.534 }, 00:25:21.534 { 00:25:21.534 "name": null, 00:25:21.534 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:21.534 "is_configured": false, 00:25:21.534 "data_offset": 256, 00:25:21.534 "data_size": 7936 00:25:21.534 } 00:25:21.534 ] 00:25:21.534 }' 00:25:21.534 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:21.534 08:01:06 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:22.102 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:25:22.102 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:25:22.102 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:22.102 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:22.102 [2024-07-15 08:01:06.784016] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:22.102 [2024-07-15 08:01:06.784044] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.102 [2024-07-15 08:01:06.784055] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101d7a0 00:25:22.102 [2024-07-15 08:01:06.784061] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.102 [2024-07-15 08:01:06.784191] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.102 [2024-07-15 08:01:06.784200] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:22.103 [2024-07-15 08:01:06.784226] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:22.103 [2024-07-15 08:01:06.784237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:22.103 [2024-07-15 08:01:06.784305] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101ddf0 00:25:22.103 [2024-07-15 08:01:06.784311] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:22.103 [2024-07-15 08:01:06.784349] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101fbe0 00:25:22.103 [2024-07-15 08:01:06.784427] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101ddf0 00:25:22.103 [2024-07-15 08:01:06.784432] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101ddf0 
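For reference, the superblock re-assembly exercised in the trace above reduces to the following RPC sequence. This is only a condensed sketch: the RPC shell variable is shorthand introduced here, while the commands, socket path and UUIDs are copied from the trace.

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # pt1 exposes malloc1, whose separate md area still holds the raid1 superblock,
  # so raid_bdev1 re-appears in the "configuring" state with one of two members discovered
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  # pt2 exposes malloc2 and completes the set; raid_bdev1 moves to "online"
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # the state query that verify_raid_bdev_state evaluates
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

Note that no bdev_raid_create call is issued here: the examine path ("raid superblock found on bdev pt1/pt2" in the trace) claims each passthru bdev and assembles the array from the on-disk superblocks.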
00:25:22.103 [2024-07-15 08:01:06.784480] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.103 pt2 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.103 08:01:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.362 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.362 "name": "raid_bdev1", 00:25:22.362 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:22.362 "strip_size_kb": 0, 00:25:22.362 "state": "online", 00:25:22.362 "raid_level": "raid1", 00:25:22.362 "superblock": true, 00:25:22.362 "num_base_bdevs": 2, 00:25:22.362 "num_base_bdevs_discovered": 2, 00:25:22.362 "num_base_bdevs_operational": 2, 00:25:22.362 "base_bdevs_list": [ 00:25:22.362 { 00:25:22.362 "name": "pt1", 00:25:22.362 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:22.362 "is_configured": true, 00:25:22.362 "data_offset": 256, 00:25:22.362 "data_size": 7936 00:25:22.362 }, 00:25:22.362 { 00:25:22.362 "name": "pt2", 00:25:22.362 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:22.362 "is_configured": true, 00:25:22.362 "data_offset": 256, 00:25:22.362 "data_size": 7936 00:25:22.362 } 00:25:22.362 ] 00:25:22.362 }' 00:25:22.362 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.362 08:01:07 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:22.931 08:01:07 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:22.931 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:23.192 [2024-07-15 08:01:07.734621] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:23.192 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:23.192 "name": "raid_bdev1", 00:25:23.192 "aliases": [ 00:25:23.192 "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0" 00:25:23.192 ], 00:25:23.192 "product_name": "Raid Volume", 00:25:23.192 "block_size": 4096, 00:25:23.192 "num_blocks": 7936, 00:25:23.192 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:23.192 "md_size": 32, 00:25:23.192 "md_interleave": false, 00:25:23.192 "dif_type": 0, 00:25:23.192 "assigned_rate_limits": { 00:25:23.192 "rw_ios_per_sec": 0, 00:25:23.192 "rw_mbytes_per_sec": 0, 00:25:23.192 "r_mbytes_per_sec": 0, 00:25:23.192 "w_mbytes_per_sec": 0 00:25:23.192 }, 00:25:23.192 "claimed": false, 00:25:23.192 "zoned": false, 00:25:23.192 "supported_io_types": { 00:25:23.192 "read": true, 00:25:23.192 "write": true, 00:25:23.192 "unmap": false, 00:25:23.192 "flush": false, 00:25:23.192 "reset": true, 00:25:23.192 "nvme_admin": false, 00:25:23.192 "nvme_io": false, 00:25:23.192 "nvme_io_md": false, 00:25:23.192 "write_zeroes": true, 00:25:23.192 "zcopy": false, 00:25:23.192 "get_zone_info": false, 00:25:23.192 "zone_management": false, 00:25:23.192 "zone_append": false, 00:25:23.192 "compare": false, 00:25:23.192 "compare_and_write": false, 00:25:23.192 "abort": false, 00:25:23.192 "seek_hole": false, 00:25:23.192 "seek_data": false, 00:25:23.192 "copy": false, 00:25:23.192 "nvme_iov_md": false 00:25:23.192 }, 00:25:23.192 "memory_domains": [ 00:25:23.192 { 00:25:23.192 "dma_device_id": "system", 00:25:23.192 "dma_device_type": 1 00:25:23.192 }, 00:25:23.192 { 00:25:23.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.192 "dma_device_type": 2 00:25:23.192 }, 00:25:23.192 { 00:25:23.192 "dma_device_id": "system", 00:25:23.192 "dma_device_type": 1 00:25:23.192 }, 00:25:23.192 { 00:25:23.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.192 "dma_device_type": 2 00:25:23.192 } 00:25:23.192 ], 00:25:23.192 "driver_specific": { 00:25:23.192 "raid": { 00:25:23.192 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:23.192 "strip_size_kb": 0, 00:25:23.192 "state": "online", 00:25:23.192 "raid_level": "raid1", 00:25:23.192 "superblock": true, 00:25:23.192 "num_base_bdevs": 2, 00:25:23.192 "num_base_bdevs_discovered": 2, 00:25:23.192 "num_base_bdevs_operational": 2, 00:25:23.192 "base_bdevs_list": [ 00:25:23.192 { 00:25:23.192 "name": "pt1", 00:25:23.192 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:23.192 "is_configured": true, 00:25:23.192 "data_offset": 256, 00:25:23.192 "data_size": 7936 00:25:23.192 }, 00:25:23.192 { 00:25:23.192 "name": "pt2", 00:25:23.192 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:23.192 "is_configured": true, 00:25:23.192 "data_offset": 256, 00:25:23.192 "data_size": 7936 
00:25:23.192 } 00:25:23.192 ] 00:25:23.192 } 00:25:23.192 } 00:25:23.192 }' 00:25:23.192 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:23.192 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:23.192 pt2' 00:25:23.192 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:23.192 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:23.192 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:23.451 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:23.451 "name": "pt1", 00:25:23.451 "aliases": [ 00:25:23.451 "00000000-0000-0000-0000-000000000001" 00:25:23.451 ], 00:25:23.451 "product_name": "passthru", 00:25:23.451 "block_size": 4096, 00:25:23.451 "num_blocks": 8192, 00:25:23.452 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:23.452 "md_size": 32, 00:25:23.452 "md_interleave": false, 00:25:23.452 "dif_type": 0, 00:25:23.452 "assigned_rate_limits": { 00:25:23.452 "rw_ios_per_sec": 0, 00:25:23.452 "rw_mbytes_per_sec": 0, 00:25:23.452 "r_mbytes_per_sec": 0, 00:25:23.452 "w_mbytes_per_sec": 0 00:25:23.452 }, 00:25:23.452 "claimed": true, 00:25:23.452 "claim_type": "exclusive_write", 00:25:23.452 "zoned": false, 00:25:23.452 "supported_io_types": { 00:25:23.452 "read": true, 00:25:23.452 "write": true, 00:25:23.452 "unmap": true, 00:25:23.452 "flush": true, 00:25:23.452 "reset": true, 00:25:23.452 "nvme_admin": false, 00:25:23.452 "nvme_io": false, 00:25:23.452 "nvme_io_md": false, 00:25:23.452 "write_zeroes": true, 00:25:23.452 "zcopy": true, 00:25:23.452 "get_zone_info": false, 00:25:23.452 "zone_management": false, 00:25:23.452 "zone_append": false, 00:25:23.452 "compare": false, 00:25:23.452 "compare_and_write": false, 00:25:23.452 "abort": true, 00:25:23.452 "seek_hole": false, 00:25:23.452 "seek_data": false, 00:25:23.452 "copy": true, 00:25:23.452 "nvme_iov_md": false 00:25:23.452 }, 00:25:23.452 "memory_domains": [ 00:25:23.452 { 00:25:23.452 "dma_device_id": "system", 00:25:23.452 "dma_device_type": 1 00:25:23.452 }, 00:25:23.452 { 00:25:23.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.452 "dma_device_type": 2 00:25:23.452 } 00:25:23.452 ], 00:25:23.452 "driver_specific": { 00:25:23.452 "passthru": { 00:25:23.452 "name": "pt1", 00:25:23.452 "base_bdev_name": "malloc1" 00:25:23.452 } 00:25:23.452 } 00:25:23.452 }' 00:25:23.452 08:01:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:23.452 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:23.452 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:23.452 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:23.452 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:23.452 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:25:23.452 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:23.452 
08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:23.731 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:23.990 "name": "pt2", 00:25:23.990 "aliases": [ 00:25:23.990 "00000000-0000-0000-0000-000000000002" 00:25:23.990 ], 00:25:23.990 "product_name": "passthru", 00:25:23.990 "block_size": 4096, 00:25:23.990 "num_blocks": 8192, 00:25:23.990 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:23.990 "md_size": 32, 00:25:23.990 "md_interleave": false, 00:25:23.990 "dif_type": 0, 00:25:23.990 "assigned_rate_limits": { 00:25:23.990 "rw_ios_per_sec": 0, 00:25:23.990 "rw_mbytes_per_sec": 0, 00:25:23.990 "r_mbytes_per_sec": 0, 00:25:23.990 "w_mbytes_per_sec": 0 00:25:23.990 }, 00:25:23.990 "claimed": true, 00:25:23.990 "claim_type": "exclusive_write", 00:25:23.990 "zoned": false, 00:25:23.990 "supported_io_types": { 00:25:23.990 "read": true, 00:25:23.990 "write": true, 00:25:23.990 "unmap": true, 00:25:23.990 "flush": true, 00:25:23.990 "reset": true, 00:25:23.990 "nvme_admin": false, 00:25:23.990 "nvme_io": false, 00:25:23.990 "nvme_io_md": false, 00:25:23.990 "write_zeroes": true, 00:25:23.990 "zcopy": true, 00:25:23.990 "get_zone_info": false, 00:25:23.990 "zone_management": false, 00:25:23.990 "zone_append": false, 00:25:23.990 "compare": false, 00:25:23.990 "compare_and_write": false, 00:25:23.990 "abort": true, 00:25:23.990 "seek_hole": false, 00:25:23.990 "seek_data": false, 00:25:23.990 "copy": true, 00:25:23.990 "nvme_iov_md": false 00:25:23.990 }, 00:25:23.990 "memory_domains": [ 00:25:23.990 { 00:25:23.990 "dma_device_id": "system", 00:25:23.990 "dma_device_type": 1 00:25:23.990 }, 00:25:23.990 { 00:25:23.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.990 "dma_device_type": 2 00:25:23.990 } 00:25:23.990 ], 00:25:23.990 "driver_specific": { 00:25:23.990 "passthru": { 00:25:23.990 "name": "pt2", 00:25:23.990 "base_bdev_name": "malloc2" 00:25:23.990 } 00:25:23.990 } 00:25:23.990 }' 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- 
# [[ 32 == 32 ]] 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:23.990 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:25:24.250 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:24.250 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:24.250 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:25:24.250 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:24.250 08:01:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:24.509 [2024-07-15 08:01:09.013850] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0 '!=' a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0 ']' 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:24.509 [2024-07-15 08:01:09.206140] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.509 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.769 08:01:09 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:24.769 "name": "raid_bdev1", 00:25:24.769 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:24.769 "strip_size_kb": 0, 00:25:24.769 "state": "online", 00:25:24.769 "raid_level": "raid1", 00:25:24.769 "superblock": true, 00:25:24.769 "num_base_bdevs": 2, 00:25:24.769 "num_base_bdevs_discovered": 1, 00:25:24.769 "num_base_bdevs_operational": 1, 00:25:24.769 "base_bdevs_list": [ 00:25:24.769 { 00:25:24.769 "name": null, 00:25:24.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.769 "is_configured": false, 00:25:24.769 "data_offset": 256, 00:25:24.769 "data_size": 7936 00:25:24.769 }, 00:25:24.769 { 00:25:24.769 "name": "pt2", 00:25:24.769 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:24.769 "is_configured": true, 00:25:24.769 "data_offset": 256, 00:25:24.769 "data_size": 7936 00:25:24.769 } 00:25:24.769 ] 00:25:24.769 }' 00:25:24.769 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:24.769 08:01:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:25.337 08:01:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:25.596 [2024-07-15 08:01:10.132473] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:25.596 [2024-07-15 08:01:10.132492] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:25.596 [2024-07-15 08:01:10.132527] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:25.596 [2024-07-15 08:01:10.132555] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:25.596 [2024-07-15 08:01:10.132561] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101ddf0 name raid_bdev1, state offline 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:25.596 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:25.855 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:25:25.855 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:25.855 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:25:25.855 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:25:25.855 08:01:10 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:25:25.855 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:26.115 [2024-07-15 08:01:10.709913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:26.115 [2024-07-15 08:01:10.709941] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:26.115 [2024-07-15 08:01:10.709952] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe9a5f0 00:25:26.115 [2024-07-15 08:01:10.709959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:26.115 [2024-07-15 08:01:10.711110] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:26.115 [2024-07-15 08:01:10.711128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:26.115 [2024-07-15 08:01:10.711159] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:26.115 [2024-07-15 08:01:10.711177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:26.115 [2024-07-15 08:01:10.711234] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101f7a0 00:25:26.115 [2024-07-15 08:01:10.711240] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:26.115 [2024-07-15 08:01:10.711280] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101eb30 00:25:26.115 [2024-07-15 08:01:10.711354] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101f7a0 00:25:26.115 [2024-07-15 08:01:10.711359] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101f7a0 00:25:26.115 [2024-07-15 08:01:10.711407] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:26.115 pt2 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.115 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.374 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.374 "name": "raid_bdev1", 00:25:26.374 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:26.374 "strip_size_kb": 0, 00:25:26.374 "state": "online", 00:25:26.374 "raid_level": "raid1", 00:25:26.374 "superblock": true, 00:25:26.374 "num_base_bdevs": 2, 00:25:26.374 "num_base_bdevs_discovered": 1, 00:25:26.374 "num_base_bdevs_operational": 1, 00:25:26.374 "base_bdevs_list": [ 00:25:26.374 { 00:25:26.374 "name": null, 00:25:26.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.374 "is_configured": false, 00:25:26.374 "data_offset": 256, 00:25:26.374 "data_size": 7936 00:25:26.374 }, 00:25:26.374 { 00:25:26.374 "name": "pt2", 00:25:26.374 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:26.374 "is_configured": true, 00:25:26.374 "data_offset": 256, 00:25:26.374 "data_size": 7936 00:25:26.374 } 00:25:26.374 ] 00:25:26.374 }' 00:25:26.374 08:01:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.374 08:01:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:26.942 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:26.942 [2024-07-15 08:01:11.636231] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:26.942 [2024-07-15 08:01:11.636246] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:26.942 [2024-07-15 08:01:11.636277] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:26.942 [2024-07-15 08:01:11.636305] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:26.942 [2024-07-15 08:01:11.636311] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101f7a0 name raid_bdev1, state offline 00:25:26.942 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.942 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:25:27.202 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:25:27.202 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:25:27.202 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:25:27.202 08:01:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:27.461 [2024-07-15 08:01:12.021195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:27.461 [2024-07-15 08:01:12.021220] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:27.461 [2024-07-15 08:01:12.021230] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101eb80 00:25:27.461 [2024-07-15 08:01:12.021236] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:27.461 [2024-07-15 08:01:12.022369] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:27.461 [2024-07-15 08:01:12.022388] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:27.461 [2024-07-15 08:01:12.022418] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:27.461 [2024-07-15 08:01:12.022436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:27.461 [2024-07-15 08:01:12.022504] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:25:27.461 [2024-07-15 08:01:12.022512] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:27.461 [2024-07-15 08:01:12.022520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101e6d0 name raid_bdev1, state configuring 00:25:27.461 [2024-07-15 08:01:12.022534] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:27.461 [2024-07-15 08:01:12.022571] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101e070 00:25:27.461 [2024-07-15 08:01:12.022577] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:27.461 [2024-07-15 08:01:12.022621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101e870 00:25:27.461 [2024-07-15 08:01:12.022696] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101e070 00:25:27.461 [2024-07-15 08:01:12.022701] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101e070 00:25:27.461 [2024-07-15 08:01:12.022761] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.461 pt1 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.461 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.722 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:25:27.722 "name": "raid_bdev1", 00:25:27.722 "uuid": "a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0", 00:25:27.722 "strip_size_kb": 0, 00:25:27.722 "state": "online", 00:25:27.722 "raid_level": "raid1", 00:25:27.722 "superblock": true, 00:25:27.722 "num_base_bdevs": 2, 00:25:27.722 "num_base_bdevs_discovered": 1, 00:25:27.722 "num_base_bdevs_operational": 1, 00:25:27.722 "base_bdevs_list": [ 00:25:27.722 { 00:25:27.722 "name": null, 00:25:27.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.722 "is_configured": false, 00:25:27.722 "data_offset": 256, 00:25:27.722 "data_size": 7936 00:25:27.722 }, 00:25:27.722 { 00:25:27.722 "name": "pt2", 00:25:27.722 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:27.722 "is_configured": true, 00:25:27.722 "data_offset": 256, 00:25:27.722 "data_size": 7936 00:25:27.722 } 00:25:27.722 ] 00:25:27.722 }' 00:25:27.722 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.722 08:01:12 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:28.293 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:25:28.293 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:25:28.293 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:25:28.293 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:28.293 08:01:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:25:28.553 [2024-07-15 08:01:13.120148] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0 '!=' a7170f61-7b9f-4b7b-b8d8-baa9b4943ce0 ']' 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1753609 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1753609 ']' 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 1753609 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1753609 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1753609' 00:25:28.553 killing process with pid 1753609 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 1753609 00:25:28.553 [2024-07-15 08:01:13.186896] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:28.553 [2024-07-15 08:01:13.186931] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:28.553 [2024-07-15 08:01:13.186959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:28.553 [2024-07-15 08:01:13.186965] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101e070 name raid_bdev1, state offline 00:25:28.553 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 1753609 00:25:28.553 [2024-07-15 08:01:13.199473] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:28.813 08:01:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:25:28.813 00:25:28.813 real 0m13.007s 00:25:28.813 user 0m24.123s 00:25:28.813 sys 0m1.962s 00:25:28.813 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:28.813 08:01:13 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:28.813 ************************************ 00:25:28.813 END TEST raid_superblock_test_md_separate 00:25:28.813 ************************************ 00:25:28.813 08:01:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:28.813 08:01:13 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:25:28.813 08:01:13 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:25:28.813 08:01:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:28.813 08:01:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:28.813 08:01:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:28.813 ************************************ 00:25:28.813 START TEST raid_rebuild_test_sb_md_separate 00:25:28.813 ************************************ 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:28.813 08:01:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1756065 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1756065 /var/tmp/spdk-raid.sock 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1756065 ']' 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:28.813 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:28.813 08:01:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:28.813 [2024-07-15 08:01:13.462368] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:25:28.813 [2024-07-15 08:01:13.462421] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1756065 ] 00:25:28.813 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:28.813 Zero copy mechanism will not be used. 
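As a rough sketch of how the rebuild test brings up its target (the bdevperf invocation whose start-up messages continue below): the binary path, socket and flags are taken from the trace, while capturing the PID with $! is a simplification of the script's raid_pid assignment.

  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
      -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!                                         # traced here as raid_pid=1756065
  waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock   # autotest helper: block until the RPC socket is up

The -o 3M I/O size (3145728 bytes) is what triggers the "greater than zero copy threshold (65536)" notice in the trace above.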
00:25:28.813 [2024-07-15 08:01:13.551305] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.072 [2024-07-15 08:01:13.618927] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.072 [2024-07-15 08:01:13.661898] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:29.072 [2024-07-15 08:01:13.661922] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:29.641 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:29.641 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:25:29.641 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:29.641 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:25:29.900 BaseBdev1_malloc 00:25:29.900 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:30.160 [2024-07-15 08:01:14.672634] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:30.160 [2024-07-15 08:01:14.672665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.160 [2024-07-15 08:01:14.672679] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c26c0 00:25:30.160 [2024-07-15 08:01:14.672686] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.160 [2024-07-15 08:01:14.673861] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.160 [2024-07-15 08:01:14.673879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:30.160 BaseBdev1 00:25:30.160 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:30.160 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:25:30.160 BaseBdev2_malloc 00:25:30.160 08:01:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:30.420 [2024-07-15 08:01:15.056091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:30.420 [2024-07-15 08:01:15.056119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.420 [2024-07-15 08:01:15.056130] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x124fc40 00:25:30.420 [2024-07-15 08:01:15.056136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.420 [2024-07-15 08:01:15.057237] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.420 [2024-07-15 08:01:15.057255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:30.420 BaseBdev2 00:25:30.420 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:25:30.679 spare_malloc 00:25:30.679 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:30.939 spare_delay 00:25:30.939 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:30.939 [2024-07-15 08:01:15.615886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:30.939 [2024-07-15 08:01:15.615911] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.939 [2024-07-15 08:01:15.615923] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1244cf0 00:25:30.939 [2024-07-15 08:01:15.615929] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.939 [2024-07-15 08:01:15.616993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.939 [2024-07-15 08:01:15.617011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:30.939 spare 00:25:30.939 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:31.199 [2024-07-15 08:01:15.804381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:31.199 [2024-07-15 08:01:15.805371] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:31.199 [2024-07-15 08:01:15.805486] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x124e4c0 00:25:31.199 [2024-07-15 08:01:15.805494] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:31.199 [2024-07-15 08:01:15.805544] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10b8960 00:25:31.199 [2024-07-15 08:01:15.805628] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x124e4c0 00:25:31.199 [2024-07-15 08:01:15.805634] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x124e4c0 00:25:31.199 [2024-07-15 08:01:15.805682] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.199 08:01:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.458 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.458 "name": "raid_bdev1", 00:25:31.458 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:31.458 "strip_size_kb": 0, 00:25:31.458 "state": "online", 00:25:31.458 "raid_level": "raid1", 00:25:31.458 "superblock": true, 00:25:31.458 "num_base_bdevs": 2, 00:25:31.458 "num_base_bdevs_discovered": 2, 00:25:31.458 "num_base_bdevs_operational": 2, 00:25:31.458 "base_bdevs_list": [ 00:25:31.458 { 00:25:31.458 "name": "BaseBdev1", 00:25:31.458 "uuid": "3bd3ff2e-4270-567d-83e6-6a1655cce122", 00:25:31.458 "is_configured": true, 00:25:31.458 "data_offset": 256, 00:25:31.458 "data_size": 7936 00:25:31.458 }, 00:25:31.458 { 00:25:31.458 "name": "BaseBdev2", 00:25:31.458 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:31.458 "is_configured": true, 00:25:31.458 "data_offset": 256, 00:25:31.458 "data_size": 7936 00:25:31.458 } 00:25:31.458 ] 00:25:31.458 }' 00:25:31.458 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.458 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:32.028 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:32.028 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:32.028 [2024-07-15 08:01:16.726895] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:32.028 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:25:32.028 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.028 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:32.287 08:01:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:32.547 [2024-07-15 08:01:17.115701] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10b9f90 00:25:32.547 /dev/nbd0 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:32.547 1+0 records in 00:25:32.547 1+0 records out 00:25:32.547 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305381 s, 13.4 MB/s 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:32.547 08:01:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:32.547 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:25:33.117 7936+0 records in 00:25:33.117 7936+0 records out 00:25:33.117 32505856 bytes (33 MB, 31 MiB) copied, 0.685131 s, 47.4 MB/s 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:33.117 08:01:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:33.376 [2024-07-15 08:01:18.055028] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:33.376 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:33.636 [2024-07-15 08:01:18.219884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.636 08:01:18 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.636 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.895 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.895 "name": "raid_bdev1", 00:25:33.895 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:33.895 "strip_size_kb": 0, 00:25:33.896 "state": "online", 00:25:33.896 "raid_level": "raid1", 00:25:33.896 "superblock": true, 00:25:33.896 "num_base_bdevs": 2, 00:25:33.896 "num_base_bdevs_discovered": 1, 00:25:33.896 "num_base_bdevs_operational": 1, 00:25:33.896 "base_bdevs_list": [ 00:25:33.896 { 00:25:33.896 "name": null, 00:25:33.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.896 "is_configured": false, 00:25:33.896 "data_offset": 256, 00:25:33.896 "data_size": 7936 00:25:33.896 }, 00:25:33.896 { 00:25:33.896 "name": "BaseBdev2", 00:25:33.896 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:33.896 "is_configured": true, 00:25:33.896 "data_offset": 256, 00:25:33.896 "data_size": 7936 00:25:33.896 } 00:25:33.896 ] 00:25:33.896 }' 00:25:33.896 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.896 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:34.466 08:01:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:34.466 [2024-07-15 08:01:19.134252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.466 [2024-07-15 08:01:19.135868] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12502c0 00:25:34.466 [2024-07-15 08:01:19.137422] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:34.466 08:01:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:35.404 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.404 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.404 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.404 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.404 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.663 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.663 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.663 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.663 "name": "raid_bdev1", 00:25:35.663 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:35.663 "strip_size_kb": 0, 00:25:35.663 "state": "online", 00:25:35.663 "raid_level": "raid1", 00:25:35.663 "superblock": true, 00:25:35.663 "num_base_bdevs": 2, 00:25:35.663 "num_base_bdevs_discovered": 2, 00:25:35.663 "num_base_bdevs_operational": 2, 00:25:35.663 "process": { 00:25:35.663 "type": "rebuild", 00:25:35.663 "target": "spare", 00:25:35.663 "progress": { 00:25:35.663 "blocks": 2816, 00:25:35.663 "percent": 35 00:25:35.663 } 00:25:35.663 }, 00:25:35.663 "base_bdevs_list": [ 00:25:35.663 { 00:25:35.663 "name": "spare", 00:25:35.663 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:35.663 "is_configured": true, 00:25:35.663 "data_offset": 256, 00:25:35.663 "data_size": 7936 00:25:35.663 }, 00:25:35.663 { 00:25:35.663 "name": "BaseBdev2", 00:25:35.663 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:35.663 "is_configured": true, 00:25:35.663 "data_offset": 256, 00:25:35.663 "data_size": 7936 00:25:35.663 } 00:25:35.663 ] 00:25:35.663 }' 00:25:35.663 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.663 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:35.663 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:35.923 [2024-07-15 08:01:20.618615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:35.923 [2024-07-15 08:01:20.646301] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:35.923 [2024-07-15 08:01:20.646335] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:35.923 [2024-07-15 08:01:20.646345] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:35.923 [2024-07-15 08:01:20.646349] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
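The verify_raid_bdev_state check being traced at this point reduces to one RPC call plus a jq filter over its JSON output. A minimal standalone sketch of that check, reusing the rpc.py path, socket and jq expression visible in the trace (the shell variable names here are illustrative, not part of the test script):

#!/usr/bin/env bash
# Minimal sketch, assuming an SPDK target is already serving /var/tmp/spdk-raid.sock
# and raid_bdev1 exists as created earlier in this trace.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Pull the JSON description of raid_bdev1 out of the full bdev_raid_get_bdevs listing.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

# The fields the test asserts on: state, raid level, and the base-bdev counters.
echo "$info" | jq -r '.state, .raid_level, .num_base_bdevs_discovered, .num_base_bdevs_operational'

In the degraded state checked here, the raid_bdev_info block a few records later shows the expected result: state online, raid_level raid1, with a single discovered and operational base bdev left after the spare was pulled out of the running rebuild.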
00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.923 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.183 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.183 "name": "raid_bdev1", 00:25:36.183 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:36.183 "strip_size_kb": 0, 00:25:36.183 "state": "online", 00:25:36.183 "raid_level": "raid1", 00:25:36.183 "superblock": true, 00:25:36.183 "num_base_bdevs": 2, 00:25:36.183 "num_base_bdevs_discovered": 1, 00:25:36.183 "num_base_bdevs_operational": 1, 00:25:36.183 "base_bdevs_list": [ 00:25:36.183 { 00:25:36.183 "name": null, 00:25:36.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.183 "is_configured": false, 00:25:36.183 "data_offset": 256, 00:25:36.183 "data_size": 7936 00:25:36.183 }, 00:25:36.183 { 00:25:36.183 "name": "BaseBdev2", 00:25:36.183 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:36.183 "is_configured": true, 00:25:36.183 "data_offset": 256, 00:25:36.183 "data_size": 7936 00:25:36.183 } 00:25:36.183 ] 00:25:36.183 }' 00:25:36.183 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.183 08:01:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.754 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.013 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:37.013 "name": "raid_bdev1", 00:25:37.013 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:37.013 "strip_size_kb": 0, 00:25:37.013 "state": "online", 00:25:37.013 "raid_level": "raid1", 00:25:37.013 "superblock": true, 00:25:37.013 "num_base_bdevs": 2, 00:25:37.013 "num_base_bdevs_discovered": 1, 00:25:37.013 "num_base_bdevs_operational": 1, 00:25:37.013 "base_bdevs_list": [ 00:25:37.013 { 00:25:37.013 "name": null, 00:25:37.013 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:37.013 "is_configured": false, 00:25:37.013 "data_offset": 256, 00:25:37.013 "data_size": 7936 00:25:37.013 }, 00:25:37.013 { 00:25:37.013 "name": "BaseBdev2", 00:25:37.013 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:37.013 "is_configured": true, 00:25:37.013 "data_offset": 256, 00:25:37.013 "data_size": 7936 00:25:37.013 } 00:25:37.013 ] 00:25:37.013 }' 00:25:37.013 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:37.013 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:37.013 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:37.013 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:37.013 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:37.273 [2024-07-15 08:01:21.879383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:37.273 [2024-07-15 08:01:21.880976] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10c1650 00:25:37.273 [2024-07-15 08:01:21.882099] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:37.273 08:01:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.211 08:01:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.472 "name": "raid_bdev1", 00:25:38.472 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:38.472 "strip_size_kb": 0, 00:25:38.472 "state": "online", 00:25:38.472 "raid_level": "raid1", 00:25:38.472 "superblock": true, 00:25:38.472 "num_base_bdevs": 2, 00:25:38.472 "num_base_bdevs_discovered": 2, 00:25:38.472 "num_base_bdevs_operational": 2, 00:25:38.472 "process": { 00:25:38.472 "type": "rebuild", 00:25:38.472 "target": "spare", 00:25:38.472 "progress": { 00:25:38.472 "blocks": 2816, 00:25:38.472 "percent": 35 00:25:38.472 } 00:25:38.472 }, 00:25:38.472 "base_bdevs_list": [ 00:25:38.472 { 00:25:38.472 "name": "spare", 00:25:38.472 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:38.472 "is_configured": true, 00:25:38.472 "data_offset": 256, 00:25:38.472 "data_size": 7936 00:25:38.472 }, 00:25:38.472 { 00:25:38.472 "name": 
"BaseBdev2", 00:25:38.472 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:38.472 "is_configured": true, 00:25:38.472 "data_offset": 256, 00:25:38.472 "data_size": 7936 00:25:38.472 } 00:25:38.472 ] 00:25:38.472 }' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:38.472 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=949 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.472 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.732 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.732 "name": "raid_bdev1", 00:25:38.732 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:38.732 "strip_size_kb": 0, 00:25:38.732 "state": "online", 00:25:38.732 "raid_level": "raid1", 00:25:38.732 "superblock": true, 00:25:38.732 "num_base_bdevs": 2, 00:25:38.732 "num_base_bdevs_discovered": 2, 00:25:38.732 "num_base_bdevs_operational": 2, 00:25:38.732 "process": { 00:25:38.732 "type": "rebuild", 00:25:38.732 "target": "spare", 00:25:38.732 "progress": { 00:25:38.732 "blocks": 3584, 00:25:38.732 "percent": 45 00:25:38.732 } 00:25:38.732 }, 00:25:38.732 "base_bdevs_list": [ 00:25:38.732 { 00:25:38.732 "name": "spare", 00:25:38.732 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:38.732 "is_configured": true, 00:25:38.732 "data_offset": 256, 00:25:38.732 "data_size": 7936 
00:25:38.732 }, 00:25:38.732 { 00:25:38.732 "name": "BaseBdev2", 00:25:38.732 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:38.732 "is_configured": true, 00:25:38.732 "data_offset": 256, 00:25:38.732 "data_size": 7936 00:25:38.732 } 00:25:38.732 ] 00:25:38.732 }' 00:25:38.732 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.732 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:38.732 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.732 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:38.732 08:01:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.112 "name": "raid_bdev1", 00:25:40.112 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:40.112 "strip_size_kb": 0, 00:25:40.112 "state": "online", 00:25:40.112 "raid_level": "raid1", 00:25:40.112 "superblock": true, 00:25:40.112 "num_base_bdevs": 2, 00:25:40.112 "num_base_bdevs_discovered": 2, 00:25:40.112 "num_base_bdevs_operational": 2, 00:25:40.112 "process": { 00:25:40.112 "type": "rebuild", 00:25:40.112 "target": "spare", 00:25:40.112 "progress": { 00:25:40.112 "blocks": 6912, 00:25:40.112 "percent": 87 00:25:40.112 } 00:25:40.112 }, 00:25:40.112 "base_bdevs_list": [ 00:25:40.112 { 00:25:40.112 "name": "spare", 00:25:40.112 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:40.112 "is_configured": true, 00:25:40.112 "data_offset": 256, 00:25:40.112 "data_size": 7936 00:25:40.112 }, 00:25:40.112 { 00:25:40.112 "name": "BaseBdev2", 00:25:40.112 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:40.112 "is_configured": true, 00:25:40.112 "data_offset": 256, 00:25:40.112 "data_size": 7936 00:25:40.112 } 00:25:40.112 ] 00:25:40.112 }' 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:40.112 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.112 
08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:40.113 08:01:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:40.373 [2024-07-15 08:01:25.000336] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:40.373 [2024-07-15 08:01:25.000378] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:40.373 [2024-07-15 08:01:25.000441] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.385 "name": "raid_bdev1", 00:25:41.385 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:41.385 "strip_size_kb": 0, 00:25:41.385 "state": "online", 00:25:41.385 "raid_level": "raid1", 00:25:41.385 "superblock": true, 00:25:41.385 "num_base_bdevs": 2, 00:25:41.385 "num_base_bdevs_discovered": 2, 00:25:41.385 "num_base_bdevs_operational": 2, 00:25:41.385 "base_bdevs_list": [ 00:25:41.385 { 00:25:41.385 "name": "spare", 00:25:41.385 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:41.385 "is_configured": true, 00:25:41.385 "data_offset": 256, 00:25:41.385 "data_size": 7936 00:25:41.385 }, 00:25:41.385 { 00:25:41.385 "name": "BaseBdev2", 00:25:41.385 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:41.385 "is_configured": true, 00:25:41.385 "data_offset": 256, 00:25:41.385 "data_size": 7936 00:25:41.385 } 00:25:41.385 ] 00:25:41.385 }' 00:25:41.385 08:01:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.386 08:01:26 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.386 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.645 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.645 "name": "raid_bdev1", 00:25:41.645 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:41.645 "strip_size_kb": 0, 00:25:41.645 "state": "online", 00:25:41.645 "raid_level": "raid1", 00:25:41.645 "superblock": true, 00:25:41.645 "num_base_bdevs": 2, 00:25:41.645 "num_base_bdevs_discovered": 2, 00:25:41.645 "num_base_bdevs_operational": 2, 00:25:41.645 "base_bdevs_list": [ 00:25:41.645 { 00:25:41.645 "name": "spare", 00:25:41.645 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:41.645 "is_configured": true, 00:25:41.645 "data_offset": 256, 00:25:41.645 "data_size": 7936 00:25:41.645 }, 00:25:41.645 { 00:25:41.645 "name": "BaseBdev2", 00:25:41.645 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:41.645 "is_configured": true, 00:25:41.645 "data_offset": 256, 00:25:41.645 "data_size": 7936 00:25:41.645 } 00:25:41.645 ] 00:25:41.645 }' 00:25:41.645 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:41.646 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.905 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.905 "name": "raid_bdev1", 00:25:41.905 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:41.905 "strip_size_kb": 0, 00:25:41.905 "state": "online", 00:25:41.905 "raid_level": "raid1", 00:25:41.905 "superblock": true, 00:25:41.905 "num_base_bdevs": 2, 00:25:41.905 "num_base_bdevs_discovered": 2, 00:25:41.905 "num_base_bdevs_operational": 2, 00:25:41.905 "base_bdevs_list": [ 00:25:41.906 { 00:25:41.906 "name": "spare", 00:25:41.906 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:41.906 "is_configured": true, 00:25:41.906 "data_offset": 256, 00:25:41.906 "data_size": 7936 00:25:41.906 }, 00:25:41.906 { 00:25:41.906 "name": "BaseBdev2", 00:25:41.906 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:41.906 "is_configured": true, 00:25:41.906 "data_offset": 256, 00:25:41.906 "data_size": 7936 00:25:41.906 } 00:25:41.906 ] 00:25:41.906 }' 00:25:41.906 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.906 08:01:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:42.475 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:42.735 [2024-07-15 08:01:27.234279] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:42.735 [2024-07-15 08:01:27.234297] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:42.735 [2024-07-15 08:01:27.234338] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:42.735 [2024-07-15 08:01:27.234375] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:42.735 [2024-07-15 08:01:27.234381] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x124e4c0 name raid_bdev1, state offline 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:42.735 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:42.995 /dev/nbd0 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:42.995 1+0 records in 00:25:42.995 1+0 records out 00:25:42.995 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337444 s, 12.1 MB/s 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:42.995 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:43.255 /dev/nbd1 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:43.255 1+0 records in 00:25:43.255 1+0 records out 00:25:43.255 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274805 s, 14.9 MB/s 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:43.255 08:01:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd0 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:43.515 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:43.775 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:44.035 [2024-07-15 08:01:28.742506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:44.035 [2024-07-15 08:01:28.742536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:44.035 [2024-07-15 08:01:28.742548] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10b9d90 00:25:44.035 [2024-07-15 08:01:28.742555] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:44.035 [2024-07-15 08:01:28.743693] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:44.035 [2024-07-15 08:01:28.743720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:44.035 [2024-07-15 08:01:28.743759] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:44.035 [2024-07-15 08:01:28.743778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:44.035 [2024-07-15 08:01:28.743848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:25:44.035 spare 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.035 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.296 [2024-07-15 08:01:28.844132] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10b8ef0 00:25:44.296 [2024-07-15 08:01:28.844142] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:44.296 [2024-07-15 08:01:28.844191] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10c0e90 00:25:44.296 [2024-07-15 08:01:28.844286] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10b8ef0 00:25:44.296 [2024-07-15 08:01:28.844292] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10b8ef0 00:25:44.296 [2024-07-15 08:01:28.844349] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:44.296 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.296 "name": "raid_bdev1", 00:25:44.296 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:44.296 "strip_size_kb": 0, 00:25:44.296 "state": "online", 00:25:44.296 "raid_level": "raid1", 00:25:44.296 "superblock": true, 00:25:44.296 "num_base_bdevs": 2, 00:25:44.296 "num_base_bdevs_discovered": 2, 00:25:44.296 "num_base_bdevs_operational": 2, 00:25:44.296 "base_bdevs_list": [ 00:25:44.296 { 00:25:44.296 "name": "spare", 00:25:44.296 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:44.296 "is_configured": true, 00:25:44.296 "data_offset": 256, 00:25:44.296 "data_size": 7936 00:25:44.296 }, 00:25:44.296 { 00:25:44.296 "name": "BaseBdev2", 00:25:44.296 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:44.296 "is_configured": true, 00:25:44.296 "data_offset": 256, 00:25:44.296 "data_size": 7936 00:25:44.296 } 00:25:44.296 ] 00:25:44.296 }' 00:25:44.296 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.296 08:01:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
-- # set +x 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.865 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.123 "name": "raid_bdev1", 00:25:45.123 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:45.123 "strip_size_kb": 0, 00:25:45.123 "state": "online", 00:25:45.123 "raid_level": "raid1", 00:25:45.123 "superblock": true, 00:25:45.123 "num_base_bdevs": 2, 00:25:45.123 "num_base_bdevs_discovered": 2, 00:25:45.123 "num_base_bdevs_operational": 2, 00:25:45.123 "base_bdevs_list": [ 00:25:45.123 { 00:25:45.123 "name": "spare", 00:25:45.123 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:45.123 "is_configured": true, 00:25:45.123 "data_offset": 256, 00:25:45.123 "data_size": 7936 00:25:45.123 }, 00:25:45.123 { 00:25:45.123 "name": "BaseBdev2", 00:25:45.123 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:45.123 "is_configured": true, 00:25:45.123 "data_offset": 256, 00:25:45.123 "data_size": 7936 00:25:45.123 } 00:25:45.123 ] 00:25:45.123 }' 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.123 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:45.383 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:45.383 08:01:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:45.641 [2024-07-15 08:01:30.170224] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:45.642 08:01:30 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:45.642 "name": "raid_bdev1", 00:25:45.642 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:45.642 "strip_size_kb": 0, 00:25:45.642 "state": "online", 00:25:45.642 "raid_level": "raid1", 00:25:45.642 "superblock": true, 00:25:45.642 "num_base_bdevs": 2, 00:25:45.642 "num_base_bdevs_discovered": 1, 00:25:45.642 "num_base_bdevs_operational": 1, 00:25:45.642 "base_bdevs_list": [ 00:25:45.642 { 00:25:45.642 "name": null, 00:25:45.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.642 "is_configured": false, 00:25:45.642 "data_offset": 256, 00:25:45.642 "data_size": 7936 00:25:45.642 }, 00:25:45.642 { 00:25:45.642 "name": "BaseBdev2", 00:25:45.642 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:45.642 "is_configured": true, 00:25:45.642 "data_offset": 256, 00:25:45.642 "data_size": 7936 00:25:45.642 } 00:25:45.642 ] 00:25:45.642 }' 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:45.642 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:46.209 08:01:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:46.468 [2024-07-15 08:01:31.116636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:46.468 [2024-07-15 08:01:31.116744] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:46.468 [2024-07-15 08:01:31.116754] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
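The records just above show the core of this test case: after the spare slot has been removed, bdev_raid_add_base_bdev triggers a superblock examine, the sequence number stored on spare (4) is found to be older than the live array's (5), and the raid module re-adds the bdev and starts a rebuild onto it. Condensed into the two RPC calls actually used (same rpc.py path and socket as elsewhere in this trace; a sketch, not the test script itself):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Degrade the array by dropping the spare slot, then hand the same bdev back;
# the stale on-disk superblock is what lets the raid layer treat this as a re-add.
"$rpc" -s "$sock" bdev_raid_remove_base_bdev spare
"$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare

The "Started rebuild on raid bdev raid_bdev1" notice in the next record is the expected outcome of the second call.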
00:25:46.468 [2024-07-15 08:01:31.116773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:46.468 [2024-07-15 08:01:31.118322] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10c0860 00:25:46.468 [2024-07-15 08:01:31.119930] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:46.468 08:01:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.408 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.668 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:47.668 "name": "raid_bdev1", 00:25:47.668 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:47.668 "strip_size_kb": 0, 00:25:47.668 "state": "online", 00:25:47.668 "raid_level": "raid1", 00:25:47.668 "superblock": true, 00:25:47.668 "num_base_bdevs": 2, 00:25:47.668 "num_base_bdevs_discovered": 2, 00:25:47.668 "num_base_bdevs_operational": 2, 00:25:47.668 "process": { 00:25:47.668 "type": "rebuild", 00:25:47.668 "target": "spare", 00:25:47.668 "progress": { 00:25:47.668 "blocks": 2816, 00:25:47.668 "percent": 35 00:25:47.668 } 00:25:47.668 }, 00:25:47.668 "base_bdevs_list": [ 00:25:47.668 { 00:25:47.668 "name": "spare", 00:25:47.668 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:47.668 "is_configured": true, 00:25:47.668 "data_offset": 256, 00:25:47.668 "data_size": 7936 00:25:47.668 }, 00:25:47.668 { 00:25:47.668 "name": "BaseBdev2", 00:25:47.668 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:47.668 "is_configured": true, 00:25:47.668 "data_offset": 256, 00:25:47.668 "data_size": 7936 00:25:47.668 } 00:25:47.668 ] 00:25:47.668 }' 00:25:47.668 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:47.668 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:47.668 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:47.927 [2024-07-15 08:01:32.604810] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:47.927 [2024-07-15 08:01:32.628681] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:47.927 [2024-07-15 08:01:32.628716] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:47.927 [2024-07-15 08:01:32.628726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:47.927 [2024-07-15 08:01:32.628730] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.927 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.187 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:48.187 "name": "raid_bdev1", 00:25:48.187 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:48.187 "strip_size_kb": 0, 00:25:48.187 "state": "online", 00:25:48.187 "raid_level": "raid1", 00:25:48.187 "superblock": true, 00:25:48.187 "num_base_bdevs": 2, 00:25:48.187 "num_base_bdevs_discovered": 1, 00:25:48.187 "num_base_bdevs_operational": 1, 00:25:48.187 "base_bdevs_list": [ 00:25:48.187 { 00:25:48.187 "name": null, 00:25:48.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.187 "is_configured": false, 00:25:48.187 "data_offset": 256, 00:25:48.187 "data_size": 7936 00:25:48.187 }, 00:25:48.187 { 00:25:48.187 "name": "BaseBdev2", 00:25:48.187 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:48.187 "is_configured": true, 00:25:48.187 "data_offset": 256, 00:25:48.187 "data_size": 7936 00:25:48.187 } 00:25:48.187 ] 00:25:48.187 }' 00:25:48.187 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:48.187 08:01:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:48.756 08:01:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:49.015 [2024-07-15 08:01:33.528884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
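Editor's note: the "spare" device the rebuilds target throughout this test is not a raw disk but a passthru vbdev stacked on an underlying bdev named spare_delay, which is why deleting and recreating it (bdev_raid.sh@759/@761) is enough to abort one rebuild and trigger a fresh superblock examine. A minimal sketch of that pair of calls, under the same socket and path assumptions as before:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Tearing the passthru down pulls the rebuild target out from under the raid...
  "$rpc" -s "$sock" bdev_passthru_delete spare
  # ...and recreating it re-opens spare_delay, re-registers "spare", and lets
  # raid_bdev_examine find the stale superblock and re-add the bdev.
  "$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare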
00:25:49.015 [2024-07-15 08:01:33.528918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:49.015 [2024-07-15 08:01:33.528931] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10bad20 00:25:49.015 [2024-07-15 08:01:33.528937] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:49.015 [2024-07-15 08:01:33.529103] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:49.015 [2024-07-15 08:01:33.529112] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:49.015 [2024-07-15 08:01:33.529152] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:49.015 [2024-07-15 08:01:33.529159] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:49.015 [2024-07-15 08:01:33.529165] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:49.015 [2024-07-15 08:01:33.529176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:49.015 [2024-07-15 08:01:33.530657] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10c2270 00:25:49.015 [2024-07-15 08:01:33.531780] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:49.015 spare 00:25:49.015 08:01:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.953 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.212 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.212 "name": "raid_bdev1", 00:25:50.212 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:50.212 "strip_size_kb": 0, 00:25:50.212 "state": "online", 00:25:50.212 "raid_level": "raid1", 00:25:50.212 "superblock": true, 00:25:50.212 "num_base_bdevs": 2, 00:25:50.212 "num_base_bdevs_discovered": 2, 00:25:50.212 "num_base_bdevs_operational": 2, 00:25:50.212 "process": { 00:25:50.212 "type": "rebuild", 00:25:50.212 "target": "spare", 00:25:50.212 "progress": { 00:25:50.212 "blocks": 2816, 00:25:50.212 "percent": 35 00:25:50.212 } 00:25:50.212 }, 00:25:50.212 "base_bdevs_list": [ 00:25:50.212 { 00:25:50.212 "name": "spare", 00:25:50.212 "uuid": "098932da-6436-501c-bc99-83602d351dba", 00:25:50.212 "is_configured": true, 00:25:50.212 "data_offset": 256, 00:25:50.212 "data_size": 7936 00:25:50.212 }, 00:25:50.212 { 00:25:50.212 "name": "BaseBdev2", 00:25:50.212 "uuid": 
"b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:50.212 "is_configured": true, 00:25:50.212 "data_offset": 256, 00:25:50.212 "data_size": 7936 00:25:50.212 } 00:25:50.212 ] 00:25:50.212 }' 00:25:50.212 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.212 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:50.212 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.212 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:50.212 08:01:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:50.471 [2024-07-15 08:01:35.021336] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:50.471 [2024-07-15 08:01:35.040655] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:50.471 [2024-07-15 08:01:35.040684] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:50.471 [2024-07-15 08:01:35.040694] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:50.471 [2024-07-15 08:01:35.040698] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.471 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.729 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:50.729 "name": "raid_bdev1", 00:25:50.729 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:50.729 "strip_size_kb": 0, 00:25:50.729 "state": "online", 00:25:50.729 "raid_level": "raid1", 00:25:50.729 "superblock": true, 00:25:50.729 "num_base_bdevs": 2, 00:25:50.729 "num_base_bdevs_discovered": 1, 00:25:50.729 
"num_base_bdevs_operational": 1, 00:25:50.729 "base_bdevs_list": [ 00:25:50.729 { 00:25:50.729 "name": null, 00:25:50.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.729 "is_configured": false, 00:25:50.729 "data_offset": 256, 00:25:50.729 "data_size": 7936 00:25:50.729 }, 00:25:50.729 { 00:25:50.729 "name": "BaseBdev2", 00:25:50.729 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:50.729 "is_configured": true, 00:25:50.729 "data_offset": 256, 00:25:50.729 "data_size": 7936 00:25:50.729 } 00:25:50.729 ] 00:25:50.729 }' 00:25:50.729 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:50.729 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:51.299 "name": "raid_bdev1", 00:25:51.299 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:51.299 "strip_size_kb": 0, 00:25:51.299 "state": "online", 00:25:51.299 "raid_level": "raid1", 00:25:51.299 "superblock": true, 00:25:51.299 "num_base_bdevs": 2, 00:25:51.299 "num_base_bdevs_discovered": 1, 00:25:51.299 "num_base_bdevs_operational": 1, 00:25:51.299 "base_bdevs_list": [ 00:25:51.299 { 00:25:51.299 "name": null, 00:25:51.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.299 "is_configured": false, 00:25:51.299 "data_offset": 256, 00:25:51.299 "data_size": 7936 00:25:51.299 }, 00:25:51.299 { 00:25:51.299 "name": "BaseBdev2", 00:25:51.299 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:51.299 "is_configured": true, 00:25:51.299 "data_offset": 256, 00:25:51.299 "data_size": 7936 00:25:51.299 } 00:25:51.299 ] 00:25:51.299 }' 00:25:51.299 08:01:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:51.299 08:01:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:51.299 08:01:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:51.559 08:01:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:51.559 08:01:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:51.559 08:01:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:51.818 [2024-07-15 08:01:36.410030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:51.818 [2024-07-15 08:01:36.410062] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:51.818 [2024-07-15 08:01:36.410075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c28f0 00:25:51.818 [2024-07-15 08:01:36.410081] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:51.818 [2024-07-15 08:01:36.410220] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:51.818 [2024-07-15 08:01:36.410230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:51.818 [2024-07-15 08:01:36.410260] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:51.818 [2024-07-15 08:01:36.410267] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:51.818 [2024-07-15 08:01:36.410272] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:51.818 BaseBdev1 00:25:51.818 08:01:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:52.754 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.755 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.015 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.015 "name": "raid_bdev1", 00:25:53.015 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:53.015 "strip_size_kb": 0, 00:25:53.015 "state": "online", 00:25:53.015 "raid_level": "raid1", 00:25:53.015 "superblock": true, 00:25:53.015 "num_base_bdevs": 2, 00:25:53.015 "num_base_bdevs_discovered": 1, 00:25:53.015 "num_base_bdevs_operational": 1, 00:25:53.015 "base_bdevs_list": [ 00:25:53.015 { 
00:25:53.015 "name": null, 00:25:53.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.015 "is_configured": false, 00:25:53.015 "data_offset": 256, 00:25:53.015 "data_size": 7936 00:25:53.015 }, 00:25:53.015 { 00:25:53.015 "name": "BaseBdev2", 00:25:53.015 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:53.015 "is_configured": true, 00:25:53.015 "data_offset": 256, 00:25:53.015 "data_size": 7936 00:25:53.015 } 00:25:53.015 ] 00:25:53.015 }' 00:25:53.015 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.015 08:01:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:53.583 "name": "raid_bdev1", 00:25:53.583 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:53.583 "strip_size_kb": 0, 00:25:53.583 "state": "online", 00:25:53.583 "raid_level": "raid1", 00:25:53.583 "superblock": true, 00:25:53.583 "num_base_bdevs": 2, 00:25:53.583 "num_base_bdevs_discovered": 1, 00:25:53.583 "num_base_bdevs_operational": 1, 00:25:53.583 "base_bdevs_list": [ 00:25:53.583 { 00:25:53.583 "name": null, 00:25:53.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.583 "is_configured": false, 00:25:53.583 "data_offset": 256, 00:25:53.583 "data_size": 7936 00:25:53.583 }, 00:25:53.583 { 00:25:53.583 "name": "BaseBdev2", 00:25:53.583 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:53.583 "is_configured": true, 00:25:53.583 "data_offset": 256, 00:25:53.583 "data_size": 7936 00:25:53.583 } 00:25:53.583 ] 00:25:53.583 }' 00:25:53.583 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:53.843 [2024-07-15 08:01:38.571493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:53.843 [2024-07-15 08:01:38.571580] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:53.843 [2024-07-15 08:01:38.571588] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:53.843 request: 00:25:53.843 { 00:25:53.843 "base_bdev": "BaseBdev1", 00:25:53.843 "raid_bdev": "raid_bdev1", 00:25:53.843 "method": "bdev_raid_add_base_bdev", 00:25:53.843 "req_id": 1 00:25:53.843 } 00:25:53.843 Got JSON-RPC error response 00:25:53.843 response: 00:25:53.843 { 00:25:53.843 "code": -22, 00:25:53.843 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:53.843 } 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:53.843 08:01:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.223 08:01:39 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.223 "name": "raid_bdev1", 00:25:55.223 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:55.223 "strip_size_kb": 0, 00:25:55.223 "state": "online", 00:25:55.223 "raid_level": "raid1", 00:25:55.223 "superblock": true, 00:25:55.223 "num_base_bdevs": 2, 00:25:55.223 "num_base_bdevs_discovered": 1, 00:25:55.223 "num_base_bdevs_operational": 1, 00:25:55.223 "base_bdevs_list": [ 00:25:55.223 { 00:25:55.223 "name": null, 00:25:55.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.223 "is_configured": false, 00:25:55.223 "data_offset": 256, 00:25:55.223 "data_size": 7936 00:25:55.223 }, 00:25:55.223 { 00:25:55.223 "name": "BaseBdev2", 00:25:55.223 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:55.223 "is_configured": true, 00:25:55.223 "data_offset": 256, 00:25:55.223 "data_size": 7936 00:25:55.223 } 00:25:55.223 ] 00:25:55.223 }' 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.223 08:01:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:55.792 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:55.792 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.793 "name": "raid_bdev1", 00:25:55.793 "uuid": "c027def7-02d2-4867-aab5-eda6eaebf751", 00:25:55.793 "strip_size_kb": 0, 
00:25:55.793 "state": "online", 00:25:55.793 "raid_level": "raid1", 00:25:55.793 "superblock": true, 00:25:55.793 "num_base_bdevs": 2, 00:25:55.793 "num_base_bdevs_discovered": 1, 00:25:55.793 "num_base_bdevs_operational": 1, 00:25:55.793 "base_bdevs_list": [ 00:25:55.793 { 00:25:55.793 "name": null, 00:25:55.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.793 "is_configured": false, 00:25:55.793 "data_offset": 256, 00:25:55.793 "data_size": 7936 00:25:55.793 }, 00:25:55.793 { 00:25:55.793 "name": "BaseBdev2", 00:25:55.793 "uuid": "b9deb914-0781-5639-89b7-e23dfe941d84", 00:25:55.793 "is_configured": true, 00:25:55.793 "data_offset": 256, 00:25:55.793 "data_size": 7936 00:25:55.793 } 00:25:55.793 ] 00:25:55.793 }' 00:25:55.793 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1756065 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1756065 ']' 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1756065 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1756065 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1756065' 00:25:56.052 killing process with pid 1756065 00:25:56.052 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1756065 00:25:56.052 Received shutdown signal, test time was about 60.000000 seconds 00:25:56.052 00:25:56.052 Latency(us) 00:25:56.052 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:56.052 =================================================================================================================== 00:25:56.052 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:56.052 [2024-07-15 08:01:40.647900] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:56.052 [2024-07-15 08:01:40.647961] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:56.053 [2024-07-15 08:01:40.647991] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:56.053 [2024-07-15 08:01:40.647998] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b8ef0 name raid_bdev1, state offline 00:25:56.053 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 
1756065 00:25:56.053 [2024-07-15 08:01:40.666542] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:56.053 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:25:56.053 00:25:56.053 real 0m27.389s 00:25:56.053 user 0m42.788s 00:25:56.053 sys 0m3.519s 00:25:56.053 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:56.053 08:01:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:25:56.053 ************************************ 00:25:56.053 END TEST raid_rebuild_test_sb_md_separate 00:25:56.053 ************************************ 00:25:56.313 08:01:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:56.313 08:01:40 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:25:56.313 08:01:40 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:25:56.313 08:01:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:25:56.313 08:01:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:56.313 08:01:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:56.313 ************************************ 00:25:56.313 START TEST raid_state_function_test_sb_md_interleaved 00:25:56.313 ************************************ 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1761110 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1761110' 00:25:56.313 Process raid pid: 1761110 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1761110 /var/tmp/spdk-raid.sock 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1761110 ']' 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:56.313 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:56.313 08:01:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:56.313 [2024-07-15 08:01:40.929377] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
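Editor's note: with the standalone bdev_svc app above listening on /var/tmp/spdk-raid.sock (started with -L bdev_raid, which produces the DEBUG traces that follow), the md_interleaved variant builds its base devices as 32 MiB malloc bdevs with 4096-byte blocks plus 32 bytes of interleaved metadata, and assembles a raid1 with an on-disk superblock. A rough sketch using only the RPCs this test itself issues (bdev_raid.sh@251/@257/@267); the ordering here is simplified, since the test deliberately creates the raid before its base bdevs exist and checks that it sits in the "configuring" state:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  # Two malloc base bdevs: 32 MiB, 4096-byte blocks, 32-byte interleaved metadata
  # (matches the block_size 4128 / num_blocks 8192 / md_interleave true seen later).
  "$rpc" -s "$sock" bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1
  "$rpc" -s "$sock" bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2

  # raid1 across the pair, -s for an on-disk superblock; with both base bdevs
  # already present the raid assembles immediately instead of staying "configuring".
  "$rpc" -s "$sock" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  "$rpc" -s "$sock" bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .state'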
00:25:56.313 [2024-07-15 08:01:40.929426] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:56.313 [2024-07-15 08:01:41.019366] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:56.572 [2024-07-15 08:01:41.087160] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:56.572 [2024-07-15 08:01:41.135613] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:56.572 [2024-07-15 08:01:41.135633] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:57.140 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:57.140 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:25:57.140 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:57.400 [2024-07-15 08:01:41.942840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:57.400 [2024-07-15 08:01:41.942868] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:57.400 [2024-07-15 08:01:41.942874] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:57.400 [2024-07-15 08:01:41.942880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:57.400 08:01:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.400 08:01:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:57.400 "name": "Existed_Raid", 00:25:57.400 "uuid": "4e55ed26-c941-442d-9c07-d920600183d2", 00:25:57.400 "strip_size_kb": 0, 00:25:57.400 "state": "configuring", 00:25:57.400 "raid_level": "raid1", 00:25:57.400 "superblock": true, 00:25:57.400 "num_base_bdevs": 2, 00:25:57.400 "num_base_bdevs_discovered": 0, 00:25:57.400 "num_base_bdevs_operational": 2, 00:25:57.400 "base_bdevs_list": [ 00:25:57.400 { 00:25:57.400 "name": "BaseBdev1", 00:25:57.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.400 "is_configured": false, 00:25:57.400 "data_offset": 0, 00:25:57.400 "data_size": 0 00:25:57.400 }, 00:25:57.400 { 00:25:57.400 "name": "BaseBdev2", 00:25:57.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.400 "is_configured": false, 00:25:57.400 "data_offset": 0, 00:25:57.400 "data_size": 0 00:25:57.400 } 00:25:57.400 ] 00:25:57.400 }' 00:25:57.400 08:01:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:57.400 08:01:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:57.970 08:01:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:58.229 [2024-07-15 08:01:42.877083] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:58.229 [2024-07-15 08:01:42.877101] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe346b0 name Existed_Raid, state configuring 00:25:58.229 08:01:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:58.489 [2024-07-15 08:01:43.061568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:58.489 [2024-07-15 08:01:43.061587] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:58.489 [2024-07-15 08:01:43.061592] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:58.489 [2024-07-15 08:01:43.061597] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:58.489 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:25:58.785 [2024-07-15 08:01:43.252653] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:58.785 BaseBdev1 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:58.785 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:59.045 [ 00:25:59.045 { 00:25:59.045 "name": "BaseBdev1", 00:25:59.045 "aliases": [ 00:25:59.045 "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243" 00:25:59.045 ], 00:25:59.045 "product_name": "Malloc disk", 00:25:59.045 "block_size": 4128, 00:25:59.045 "num_blocks": 8192, 00:25:59.045 "uuid": "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243", 00:25:59.045 "md_size": 32, 00:25:59.045 "md_interleave": true, 00:25:59.045 "dif_type": 0, 00:25:59.045 "assigned_rate_limits": { 00:25:59.045 "rw_ios_per_sec": 0, 00:25:59.045 "rw_mbytes_per_sec": 0, 00:25:59.045 "r_mbytes_per_sec": 0, 00:25:59.045 "w_mbytes_per_sec": 0 00:25:59.045 }, 00:25:59.045 "claimed": true, 00:25:59.045 "claim_type": "exclusive_write", 00:25:59.045 "zoned": false, 00:25:59.045 "supported_io_types": { 00:25:59.045 "read": true, 00:25:59.045 "write": true, 00:25:59.045 "unmap": true, 00:25:59.045 "flush": true, 00:25:59.045 "reset": true, 00:25:59.045 "nvme_admin": false, 00:25:59.045 "nvme_io": false, 00:25:59.045 "nvme_io_md": false, 00:25:59.045 "write_zeroes": true, 00:25:59.045 "zcopy": true, 00:25:59.045 "get_zone_info": false, 00:25:59.045 "zone_management": false, 00:25:59.045 "zone_append": false, 00:25:59.045 "compare": false, 00:25:59.045 "compare_and_write": false, 00:25:59.045 "abort": true, 00:25:59.045 "seek_hole": false, 00:25:59.045 "seek_data": false, 00:25:59.045 "copy": true, 00:25:59.045 "nvme_iov_md": false 00:25:59.045 }, 00:25:59.045 "memory_domains": [ 00:25:59.045 { 00:25:59.045 "dma_device_id": "system", 00:25:59.045 "dma_device_type": 1 00:25:59.045 }, 00:25:59.045 { 00:25:59.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:59.045 "dma_device_type": 2 00:25:59.045 } 00:25:59.045 ], 00:25:59.045 "driver_specific": {} 00:25:59.045 } 00:25:59.045 ] 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.045 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:59.305 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.305 "name": "Existed_Raid", 00:25:59.305 "uuid": "6b896d4a-ff92-42d7-8ad2-e01482977d4c", 00:25:59.305 "strip_size_kb": 0, 00:25:59.305 "state": "configuring", 00:25:59.305 "raid_level": "raid1", 00:25:59.305 "superblock": true, 00:25:59.305 "num_base_bdevs": 2, 00:25:59.305 "num_base_bdevs_discovered": 1, 00:25:59.305 "num_base_bdevs_operational": 2, 00:25:59.305 "base_bdevs_list": [ 00:25:59.305 { 00:25:59.305 "name": "BaseBdev1", 00:25:59.305 "uuid": "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243", 00:25:59.305 "is_configured": true, 00:25:59.305 "data_offset": 256, 00:25:59.305 "data_size": 7936 00:25:59.305 }, 00:25:59.305 { 00:25:59.305 "name": "BaseBdev2", 00:25:59.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.305 "is_configured": false, 00:25:59.305 "data_offset": 0, 00:25:59.305 "data_size": 0 00:25:59.305 } 00:25:59.305 ] 00:25:59.305 }' 00:25:59.305 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.305 08:01:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:59.875 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:59.875 [2024-07-15 08:01:44.527894] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:59.875 [2024-07-15 08:01:44.527919] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe33fa0 name Existed_Raid, state configuring 00:25:59.875 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:00.135 [2024-07-15 08:01:44.716400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:00.135 [2024-07-15 08:01:44.717528] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:00.135 [2024-07-15 08:01:44.717551] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.135 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:00.394 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:00.394 "name": "Existed_Raid", 00:26:00.394 "uuid": "08387f85-583c-4405-a565-91397703d66a", 00:26:00.394 "strip_size_kb": 0, 00:26:00.394 "state": "configuring", 00:26:00.394 "raid_level": "raid1", 00:26:00.394 "superblock": true, 00:26:00.394 "num_base_bdevs": 2, 00:26:00.394 "num_base_bdevs_discovered": 1, 00:26:00.394 "num_base_bdevs_operational": 2, 00:26:00.394 "base_bdevs_list": [ 00:26:00.394 { 00:26:00.394 "name": "BaseBdev1", 00:26:00.394 "uuid": "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243", 00:26:00.394 "is_configured": true, 00:26:00.394 "data_offset": 256, 00:26:00.394 "data_size": 7936 00:26:00.394 }, 00:26:00.394 { 00:26:00.394 "name": "BaseBdev2", 00:26:00.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.394 "is_configured": false, 00:26:00.394 "data_offset": 0, 00:26:00.394 "data_size": 0 00:26:00.394 } 00:26:00.394 ] 00:26:00.394 }' 00:26:00.394 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:00.394 08:01:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:00.656 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:26:00.917 [2024-07-15 08:01:45.587692] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:00.917 [2024-07-15 08:01:45.587793] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe35e40 00:26:00.917 [2024-07-15 08:01:45.587801] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:00.917 [2024-07-15 08:01:45.587842] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe336a0 00:26:00.917 [2024-07-15 08:01:45.587902] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe35e40 00:26:00.917 [2024-07-15 08:01:45.587908] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, 
raid_bdev 0xe35e40 00:26:00.917 [2024-07-15 08:01:45.587947] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:00.917 BaseBdev2 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:00.917 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:01.176 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:01.436 [ 00:26:01.436 { 00:26:01.436 "name": "BaseBdev2", 00:26:01.436 "aliases": [ 00:26:01.436 "793ced5b-a648-4241-aa7e-89fa975b2887" 00:26:01.436 ], 00:26:01.436 "product_name": "Malloc disk", 00:26:01.436 "block_size": 4128, 00:26:01.436 "num_blocks": 8192, 00:26:01.436 "uuid": "793ced5b-a648-4241-aa7e-89fa975b2887", 00:26:01.436 "md_size": 32, 00:26:01.436 "md_interleave": true, 00:26:01.436 "dif_type": 0, 00:26:01.436 "assigned_rate_limits": { 00:26:01.436 "rw_ios_per_sec": 0, 00:26:01.436 "rw_mbytes_per_sec": 0, 00:26:01.436 "r_mbytes_per_sec": 0, 00:26:01.436 "w_mbytes_per_sec": 0 00:26:01.436 }, 00:26:01.436 "claimed": true, 00:26:01.436 "claim_type": "exclusive_write", 00:26:01.436 "zoned": false, 00:26:01.436 "supported_io_types": { 00:26:01.436 "read": true, 00:26:01.436 "write": true, 00:26:01.436 "unmap": true, 00:26:01.436 "flush": true, 00:26:01.436 "reset": true, 00:26:01.436 "nvme_admin": false, 00:26:01.436 "nvme_io": false, 00:26:01.436 "nvme_io_md": false, 00:26:01.436 "write_zeroes": true, 00:26:01.436 "zcopy": true, 00:26:01.436 "get_zone_info": false, 00:26:01.436 "zone_management": false, 00:26:01.436 "zone_append": false, 00:26:01.436 "compare": false, 00:26:01.436 "compare_and_write": false, 00:26:01.436 "abort": true, 00:26:01.436 "seek_hole": false, 00:26:01.436 "seek_data": false, 00:26:01.436 "copy": true, 00:26:01.436 "nvme_iov_md": false 00:26:01.436 }, 00:26:01.436 "memory_domains": [ 00:26:01.436 { 00:26:01.436 "dma_device_id": "system", 00:26:01.436 "dma_device_type": 1 00:26:01.436 }, 00:26:01.436 { 00:26:01.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:01.436 "dma_device_type": 2 00:26:01.436 } 00:26:01.436 ], 00:26:01.436 "driver_specific": {} 00:26:01.436 } 00:26:01.436 ] 00:26:01.436 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:26:01.436 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:01.436 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:01.436 08:01:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.437 08:01:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:01.437 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.437 "name": "Existed_Raid", 00:26:01.437 "uuid": "08387f85-583c-4405-a565-91397703d66a", 00:26:01.437 "strip_size_kb": 0, 00:26:01.437 "state": "online", 00:26:01.437 "raid_level": "raid1", 00:26:01.437 "superblock": true, 00:26:01.437 "num_base_bdevs": 2, 00:26:01.437 "num_base_bdevs_discovered": 2, 00:26:01.437 "num_base_bdevs_operational": 2, 00:26:01.437 "base_bdevs_list": [ 00:26:01.437 { 00:26:01.437 "name": "BaseBdev1", 00:26:01.437 "uuid": "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243", 00:26:01.437 "is_configured": true, 00:26:01.437 "data_offset": 256, 00:26:01.437 "data_size": 7936 00:26:01.437 }, 00:26:01.437 { 00:26:01.437 "name": "BaseBdev2", 00:26:01.437 "uuid": "793ced5b-a648-4241-aa7e-89fa975b2887", 00:26:01.437 "is_configured": true, 00:26:01.437 "data_offset": 256, 00:26:01.437 "data_size": 7936 00:26:01.437 } 00:26:01.437 ] 00:26:01.437 }' 00:26:01.437 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.437 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:02.006 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:02.264 [2024-07-15 08:01:46.903259] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:02.264 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:02.264 "name": "Existed_Raid", 00:26:02.264 "aliases": [ 00:26:02.264 "08387f85-583c-4405-a565-91397703d66a" 00:26:02.264 ], 00:26:02.264 "product_name": "Raid Volume", 00:26:02.264 "block_size": 4128, 00:26:02.264 "num_blocks": 7936, 00:26:02.264 "uuid": "08387f85-583c-4405-a565-91397703d66a", 00:26:02.264 "md_size": 32, 00:26:02.264 "md_interleave": true, 00:26:02.265 "dif_type": 0, 00:26:02.265 "assigned_rate_limits": { 00:26:02.265 "rw_ios_per_sec": 0, 00:26:02.265 "rw_mbytes_per_sec": 0, 00:26:02.265 "r_mbytes_per_sec": 0, 00:26:02.265 "w_mbytes_per_sec": 0 00:26:02.265 }, 00:26:02.265 "claimed": false, 00:26:02.265 "zoned": false, 00:26:02.265 "supported_io_types": { 00:26:02.265 "read": true, 00:26:02.265 "write": true, 00:26:02.265 "unmap": false, 00:26:02.265 "flush": false, 00:26:02.265 "reset": true, 00:26:02.265 "nvme_admin": false, 00:26:02.265 "nvme_io": false, 00:26:02.265 "nvme_io_md": false, 00:26:02.265 "write_zeroes": true, 00:26:02.265 "zcopy": false, 00:26:02.265 "get_zone_info": false, 00:26:02.265 "zone_management": false, 00:26:02.265 "zone_append": false, 00:26:02.265 "compare": false, 00:26:02.265 "compare_and_write": false, 00:26:02.265 "abort": false, 00:26:02.265 "seek_hole": false, 00:26:02.265 "seek_data": false, 00:26:02.265 "copy": false, 00:26:02.265 "nvme_iov_md": false 00:26:02.265 }, 00:26:02.265 "memory_domains": [ 00:26:02.265 { 00:26:02.265 "dma_device_id": "system", 00:26:02.265 "dma_device_type": 1 00:26:02.265 }, 00:26:02.265 { 00:26:02.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.265 "dma_device_type": 2 00:26:02.265 }, 00:26:02.265 { 00:26:02.265 "dma_device_id": "system", 00:26:02.265 "dma_device_type": 1 00:26:02.265 }, 00:26:02.265 { 00:26:02.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.265 "dma_device_type": 2 00:26:02.265 } 00:26:02.265 ], 00:26:02.265 "driver_specific": { 00:26:02.265 "raid": { 00:26:02.265 "uuid": "08387f85-583c-4405-a565-91397703d66a", 00:26:02.265 "strip_size_kb": 0, 00:26:02.265 "state": "online", 00:26:02.265 "raid_level": "raid1", 00:26:02.265 "superblock": true, 00:26:02.265 "num_base_bdevs": 2, 00:26:02.265 "num_base_bdevs_discovered": 2, 00:26:02.265 "num_base_bdevs_operational": 2, 00:26:02.265 "base_bdevs_list": [ 00:26:02.265 { 00:26:02.265 "name": "BaseBdev1", 00:26:02.265 "uuid": "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243", 00:26:02.265 "is_configured": true, 00:26:02.265 "data_offset": 256, 00:26:02.265 "data_size": 7936 00:26:02.265 }, 00:26:02.265 { 00:26:02.265 "name": "BaseBdev2", 00:26:02.265 "uuid": "793ced5b-a648-4241-aa7e-89fa975b2887", 00:26:02.265 "is_configured": true, 00:26:02.265 "data_offset": 256, 00:26:02.265 "data_size": 7936 00:26:02.265 } 00:26:02.265 ] 00:26:02.265 } 00:26:02.265 } 00:26:02.265 }' 00:26:02.265 08:01:46 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:02.265 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:02.265 BaseBdev2' 00:26:02.265 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:02.265 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:02.265 08:01:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:02.524 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:02.524 "name": "BaseBdev1", 00:26:02.524 "aliases": [ 00:26:02.524 "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243" 00:26:02.524 ], 00:26:02.524 "product_name": "Malloc disk", 00:26:02.524 "block_size": 4128, 00:26:02.524 "num_blocks": 8192, 00:26:02.524 "uuid": "8aecdfc8-6b4a-410e-9bea-b2c6e9b8b243", 00:26:02.524 "md_size": 32, 00:26:02.524 "md_interleave": true, 00:26:02.524 "dif_type": 0, 00:26:02.524 "assigned_rate_limits": { 00:26:02.524 "rw_ios_per_sec": 0, 00:26:02.524 "rw_mbytes_per_sec": 0, 00:26:02.524 "r_mbytes_per_sec": 0, 00:26:02.524 "w_mbytes_per_sec": 0 00:26:02.524 }, 00:26:02.524 "claimed": true, 00:26:02.524 "claim_type": "exclusive_write", 00:26:02.524 "zoned": false, 00:26:02.524 "supported_io_types": { 00:26:02.524 "read": true, 00:26:02.524 "write": true, 00:26:02.524 "unmap": true, 00:26:02.524 "flush": true, 00:26:02.524 "reset": true, 00:26:02.524 "nvme_admin": false, 00:26:02.524 "nvme_io": false, 00:26:02.524 "nvme_io_md": false, 00:26:02.524 "write_zeroes": true, 00:26:02.524 "zcopy": true, 00:26:02.524 "get_zone_info": false, 00:26:02.524 "zone_management": false, 00:26:02.524 "zone_append": false, 00:26:02.524 "compare": false, 00:26:02.524 "compare_and_write": false, 00:26:02.524 "abort": true, 00:26:02.524 "seek_hole": false, 00:26:02.524 "seek_data": false, 00:26:02.524 "copy": true, 00:26:02.524 "nvme_iov_md": false 00:26:02.524 }, 00:26:02.524 "memory_domains": [ 00:26:02.524 { 00:26:02.524 "dma_device_id": "system", 00:26:02.524 "dma_device_type": 1 00:26:02.524 }, 00:26:02.524 { 00:26:02.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:02.524 "dma_device_type": 2 00:26:02.524 } 00:26:02.524 ], 00:26:02.524 "driver_specific": {} 00:26:02.524 }' 00:26:02.524 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.524 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:02.524 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:02.524 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.524 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:02.784 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:03.044 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:03.044 "name": "BaseBdev2", 00:26:03.044 "aliases": [ 00:26:03.044 "793ced5b-a648-4241-aa7e-89fa975b2887" 00:26:03.044 ], 00:26:03.044 "product_name": "Malloc disk", 00:26:03.044 "block_size": 4128, 00:26:03.044 "num_blocks": 8192, 00:26:03.044 "uuid": "793ced5b-a648-4241-aa7e-89fa975b2887", 00:26:03.044 "md_size": 32, 00:26:03.044 "md_interleave": true, 00:26:03.044 "dif_type": 0, 00:26:03.044 "assigned_rate_limits": { 00:26:03.044 "rw_ios_per_sec": 0, 00:26:03.044 "rw_mbytes_per_sec": 0, 00:26:03.044 "r_mbytes_per_sec": 0, 00:26:03.044 "w_mbytes_per_sec": 0 00:26:03.044 }, 00:26:03.044 "claimed": true, 00:26:03.044 "claim_type": "exclusive_write", 00:26:03.044 "zoned": false, 00:26:03.044 "supported_io_types": { 00:26:03.044 "read": true, 00:26:03.044 "write": true, 00:26:03.044 "unmap": true, 00:26:03.044 "flush": true, 00:26:03.044 "reset": true, 00:26:03.044 "nvme_admin": false, 00:26:03.044 "nvme_io": false, 00:26:03.044 "nvme_io_md": false, 00:26:03.044 "write_zeroes": true, 00:26:03.044 "zcopy": true, 00:26:03.044 "get_zone_info": false, 00:26:03.044 "zone_management": false, 00:26:03.044 "zone_append": false, 00:26:03.044 "compare": false, 00:26:03.044 "compare_and_write": false, 00:26:03.044 "abort": true, 00:26:03.044 "seek_hole": false, 00:26:03.044 "seek_data": false, 00:26:03.044 "copy": true, 00:26:03.044 "nvme_iov_md": false 00:26:03.044 }, 00:26:03.044 "memory_domains": [ 00:26:03.044 { 00:26:03.044 "dma_device_id": "system", 00:26:03.044 "dma_device_type": 1 00:26:03.044 }, 00:26:03.044 { 00:26:03.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:03.044 "dma_device_type": 2 00:26:03.044 } 00:26:03.044 ], 00:26:03.044 "driver_specific": {} 00:26:03.044 }' 00:26:03.044 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:03.044 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:03.044 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:03.044 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:03.320 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:03.320 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
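A minimal sketch of the property checks being traced at this point, runnable by hand against the same RPC socket while the bdev_svc app from this run is still listening. Paths, the bdev name, and the expected values are taken from the trace above; this snippet is not part of the captured output, only an illustration of what the jq assertions verify.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # BaseBdev2 was created above with: bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2,
  # so it should report 4096-byte data blocks carrying 32 bytes of interleaved metadata.
  info=$($rpc -s $sock bdev_get_bdevs -b BaseBdev2 | jq '.[]')
  [[ $(jq -r .block_size    <<< "$info") -eq 4128 ]]   # 4096 data + 32 md, interleaved
  [[ $(jq -r .md_size       <<< "$info") -eq 32   ]]
  [[ $(jq -r .md_interleave <<< "$info") == true  ]]
  [[ $(jq -r .dif_type      <<< "$info") -eq 0    ]]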
00:26:03.320 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.320 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:03.320 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:03.320 08:01:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.320 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:03.320 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:03.320 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:03.580 [2024-07-15 08:01:48.230438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.580 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:03.840 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.840 "name": "Existed_Raid", 00:26:03.840 "uuid": 
"08387f85-583c-4405-a565-91397703d66a", 00:26:03.840 "strip_size_kb": 0, 00:26:03.840 "state": "online", 00:26:03.840 "raid_level": "raid1", 00:26:03.840 "superblock": true, 00:26:03.840 "num_base_bdevs": 2, 00:26:03.840 "num_base_bdevs_discovered": 1, 00:26:03.840 "num_base_bdevs_operational": 1, 00:26:03.840 "base_bdevs_list": [ 00:26:03.840 { 00:26:03.840 "name": null, 00:26:03.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.840 "is_configured": false, 00:26:03.840 "data_offset": 256, 00:26:03.840 "data_size": 7936 00:26:03.840 }, 00:26:03.840 { 00:26:03.840 "name": "BaseBdev2", 00:26:03.840 "uuid": "793ced5b-a648-4241-aa7e-89fa975b2887", 00:26:03.840 "is_configured": true, 00:26:03.840 "data_offset": 256, 00:26:03.840 "data_size": 7936 00:26:03.840 } 00:26:03.840 ] 00:26:03.840 }' 00:26:03.840 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.840 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:04.409 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:04.409 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:04.409 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.409 08:01:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:04.669 [2024-07-15 08:01:49.369315] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:04.669 [2024-07-15 08:01:49.369374] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:04.669 [2024-07-15 08:01:49.375631] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:04.669 [2024-07-15 08:01:49.375656] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:04.669 [2024-07-15 08:01:49.375662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe35e40 name Existed_Raid, state offline 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.669 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:04.929 08:01:49 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1761110 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1761110 ']' 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1761110 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1761110 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1761110' 00:26:04.929 killing process with pid 1761110 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1761110 00:26:04.929 [2024-07-15 08:01:49.631184] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:04.929 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1761110 00:26:04.929 [2024-07-15 08:01:49.631789] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:05.188 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:26:05.188 00:26:05.188 real 0m8.887s 00:26:05.188 user 0m16.105s 00:26:05.188 sys 0m1.395s 00:26:05.188 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:05.189 08:01:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:05.189 ************************************ 00:26:05.189 END TEST raid_state_function_test_sb_md_interleaved 00:26:05.189 ************************************ 00:26:05.189 08:01:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:05.189 08:01:49 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:26:05.189 08:01:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:05.189 08:01:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:05.189 08:01:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:05.189 ************************************ 00:26:05.189 START TEST raid_superblock_test_md_interleaved 00:26:05.189 ************************************ 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=1762823 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1762823 /var/tmp/spdk-raid.sock 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1762823 ']' 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:05.189 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:05.189 08:01:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:05.189 [2024-07-15 08:01:49.900129] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
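A condensed sketch of the setup being traced here: the harness starts a bare bdev_svc app with raid debug logging enabled on a dedicated RPC socket and waits for it to answer. The PID shown in the trace is simply the one this run produced, and waitforlisten is the helper the trace attributes to common/autotest_common.sh; this is a reconstruction of the launch sequence, not part of the captured output.

  app=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
  sock=/var/tmp/spdk-raid.sock
  # -r selects the RPC listen socket, -L bdev_raid enables the *DEBUG* raid traces seen in this log
  "$app" -r "$sock" -L bdev_raid &
  raid_pid=$!
  waitforlisten "$raid_pid" "$sock"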
00:26:05.189 [2024-07-15 08:01:49.900185] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1762823 ] 00:26:05.448 [2024-07-15 08:01:49.991689] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:05.448 [2024-07-15 08:01:50.061278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:05.448 [2024-07-15 08:01:50.105071] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:05.448 [2024-07-15 08:01:50.105096] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:06.017 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:26:06.276 malloc1 00:26:06.276 08:01:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:06.535 [2024-07-15 08:01:51.115591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:06.535 [2024-07-15 08:01:51.115625] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.535 [2024-07-15 08:01:51.115637] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dcf370 00:26:06.535 [2024-07-15 08:01:51.115643] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.535 [2024-07-15 08:01:51.116802] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.535 [2024-07-15 08:01:51.116821] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:06.535 pt1 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:06.535 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:26:06.794 malloc2 00:26:06.794 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:06.794 [2024-07-15 08:01:51.486618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:06.794 [2024-07-15 08:01:51.486646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.794 [2024-07-15 08:01:51.486656] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5cb10 00:26:06.794 [2024-07-15 08:01:51.486662] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.794 [2024-07-15 08:01:51.487708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.794 [2024-07-15 08:01:51.487735] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:06.794 pt2 00:26:06.794 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:06.794 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:06.794 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:07.054 [2024-07-15 08:01:51.679109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:07.054 [2024-07-15 08:01:51.680203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:07.054 [2024-07-15 08:01:51.680317] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f5e480 00:26:07.054 [2024-07-15 08:01:51.680325] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:07.054 [2024-07-15 08:01:51.680367] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dcd560 00:26:07.054 [2024-07-15 08:01:51.680428] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f5e480 00:26:07.054 [2024-07-15 08:01:51.680433] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f5e480 00:26:07.054 [2024-07-15 08:01:51.680472] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.054 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.313 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.313 "name": "raid_bdev1", 00:26:07.313 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:07.313 "strip_size_kb": 0, 00:26:07.313 "state": "online", 00:26:07.313 "raid_level": "raid1", 00:26:07.313 "superblock": true, 00:26:07.313 "num_base_bdevs": 2, 00:26:07.313 "num_base_bdevs_discovered": 2, 00:26:07.313 "num_base_bdevs_operational": 2, 00:26:07.313 "base_bdevs_list": [ 00:26:07.313 { 00:26:07.313 "name": "pt1", 00:26:07.313 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:07.313 "is_configured": true, 00:26:07.313 "data_offset": 256, 00:26:07.313 "data_size": 7936 00:26:07.313 }, 00:26:07.313 { 00:26:07.313 "name": "pt2", 00:26:07.313 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:07.313 "is_configured": true, 00:26:07.313 "data_offset": 256, 00:26:07.313 "data_size": 7936 00:26:07.313 } 00:26:07.313 ] 00:26:07.313 }' 00:26:07.313 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.313 08:01:51 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:07.881 [2024-07-15 08:01:52.605642] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:07.881 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:07.881 "name": "raid_bdev1", 00:26:07.881 "aliases": [ 00:26:07.881 "cb4c008c-2207-4bb3-b10d-4c385c8ea58f" 00:26:07.881 ], 00:26:07.881 "product_name": "Raid Volume", 00:26:07.881 "block_size": 4128, 00:26:07.881 "num_blocks": 7936, 00:26:07.881 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:07.881 "md_size": 32, 00:26:07.881 "md_interleave": true, 00:26:07.881 "dif_type": 0, 00:26:07.881 "assigned_rate_limits": { 00:26:07.881 "rw_ios_per_sec": 0, 00:26:07.881 "rw_mbytes_per_sec": 0, 00:26:07.881 "r_mbytes_per_sec": 0, 00:26:07.881 "w_mbytes_per_sec": 0 00:26:07.881 }, 00:26:07.881 "claimed": false, 00:26:07.881 "zoned": false, 00:26:07.881 "supported_io_types": { 00:26:07.881 "read": true, 00:26:07.881 "write": true, 00:26:07.881 "unmap": false, 00:26:07.881 "flush": false, 00:26:07.881 "reset": true, 00:26:07.881 "nvme_admin": false, 00:26:07.881 "nvme_io": false, 00:26:07.882 "nvme_io_md": false, 00:26:07.882 "write_zeroes": true, 00:26:07.882 "zcopy": false, 00:26:07.882 "get_zone_info": false, 00:26:07.882 "zone_management": false, 00:26:07.882 "zone_append": false, 00:26:07.882 "compare": false, 00:26:07.882 "compare_and_write": false, 00:26:07.882 "abort": false, 00:26:07.882 "seek_hole": false, 00:26:07.882 "seek_data": false, 00:26:07.882 "copy": false, 00:26:07.882 "nvme_iov_md": false 00:26:07.882 }, 00:26:07.882 "memory_domains": [ 00:26:07.882 { 00:26:07.882 "dma_device_id": "system", 00:26:07.882 "dma_device_type": 1 00:26:07.882 }, 00:26:07.882 { 00:26:07.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.882 "dma_device_type": 2 00:26:07.882 }, 00:26:07.882 { 00:26:07.882 "dma_device_id": "system", 00:26:07.882 "dma_device_type": 1 00:26:07.882 }, 00:26:07.882 { 00:26:07.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:07.882 "dma_device_type": 2 00:26:07.882 } 00:26:07.882 ], 00:26:07.882 "driver_specific": { 00:26:07.882 "raid": { 00:26:07.882 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:07.882 "strip_size_kb": 0, 00:26:07.882 "state": "online", 00:26:07.882 "raid_level": "raid1", 00:26:07.882 "superblock": true, 00:26:07.882 "num_base_bdevs": 2, 00:26:07.882 "num_base_bdevs_discovered": 2, 00:26:07.882 "num_base_bdevs_operational": 2, 00:26:07.882 "base_bdevs_list": [ 00:26:07.882 { 00:26:07.882 "name": "pt1", 00:26:07.882 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:07.882 "is_configured": true, 00:26:07.882 "data_offset": 256, 00:26:07.882 "data_size": 7936 00:26:07.882 }, 00:26:07.882 { 00:26:07.882 "name": "pt2", 00:26:07.882 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:07.882 "is_configured": true, 00:26:07.882 "data_offset": 256, 00:26:07.882 "data_size": 7936 00:26:07.882 } 00:26:07.882 ] 00:26:07.882 } 00:26:07.882 } 00:26:07.882 }' 00:26:07.882 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:08.141 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:08.141 pt2' 00:26:08.141 
08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:08.141 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:08.141 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:08.141 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:08.141 "name": "pt1", 00:26:08.141 "aliases": [ 00:26:08.141 "00000000-0000-0000-0000-000000000001" 00:26:08.141 ], 00:26:08.141 "product_name": "passthru", 00:26:08.141 "block_size": 4128, 00:26:08.141 "num_blocks": 8192, 00:26:08.141 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:08.141 "md_size": 32, 00:26:08.141 "md_interleave": true, 00:26:08.141 "dif_type": 0, 00:26:08.141 "assigned_rate_limits": { 00:26:08.141 "rw_ios_per_sec": 0, 00:26:08.141 "rw_mbytes_per_sec": 0, 00:26:08.141 "r_mbytes_per_sec": 0, 00:26:08.141 "w_mbytes_per_sec": 0 00:26:08.141 }, 00:26:08.141 "claimed": true, 00:26:08.141 "claim_type": "exclusive_write", 00:26:08.141 "zoned": false, 00:26:08.141 "supported_io_types": { 00:26:08.141 "read": true, 00:26:08.141 "write": true, 00:26:08.141 "unmap": true, 00:26:08.141 "flush": true, 00:26:08.141 "reset": true, 00:26:08.141 "nvme_admin": false, 00:26:08.141 "nvme_io": false, 00:26:08.141 "nvme_io_md": false, 00:26:08.141 "write_zeroes": true, 00:26:08.141 "zcopy": true, 00:26:08.141 "get_zone_info": false, 00:26:08.141 "zone_management": false, 00:26:08.141 "zone_append": false, 00:26:08.141 "compare": false, 00:26:08.141 "compare_and_write": false, 00:26:08.141 "abort": true, 00:26:08.141 "seek_hole": false, 00:26:08.141 "seek_data": false, 00:26:08.141 "copy": true, 00:26:08.141 "nvme_iov_md": false 00:26:08.141 }, 00:26:08.141 "memory_domains": [ 00:26:08.141 { 00:26:08.141 "dma_device_id": "system", 00:26:08.141 "dma_device_type": 1 00:26:08.141 }, 00:26:08.141 { 00:26:08.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:08.141 "dma_device_type": 2 00:26:08.141 } 00:26:08.141 ], 00:26:08.141 "driver_specific": { 00:26:08.141 "passthru": { 00:26:08.141 "name": "pt1", 00:26:08.141 "base_bdev_name": "malloc1" 00:26:08.141 } 00:26:08.141 } 00:26:08.141 }' 00:26:08.141 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:08.141 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:08.401 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:08.401 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:08.401 08:01:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:08.401 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:08.401 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:08.401 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:08.401 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:08.401 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:08.401 08:01:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:08.666 "name": "pt2", 00:26:08.666 "aliases": [ 00:26:08.666 "00000000-0000-0000-0000-000000000002" 00:26:08.666 ], 00:26:08.666 "product_name": "passthru", 00:26:08.666 "block_size": 4128, 00:26:08.666 "num_blocks": 8192, 00:26:08.666 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:08.666 "md_size": 32, 00:26:08.666 "md_interleave": true, 00:26:08.666 "dif_type": 0, 00:26:08.666 "assigned_rate_limits": { 00:26:08.666 "rw_ios_per_sec": 0, 00:26:08.666 "rw_mbytes_per_sec": 0, 00:26:08.666 "r_mbytes_per_sec": 0, 00:26:08.666 "w_mbytes_per_sec": 0 00:26:08.666 }, 00:26:08.666 "claimed": true, 00:26:08.666 "claim_type": "exclusive_write", 00:26:08.666 "zoned": false, 00:26:08.666 "supported_io_types": { 00:26:08.666 "read": true, 00:26:08.666 "write": true, 00:26:08.666 "unmap": true, 00:26:08.666 "flush": true, 00:26:08.666 "reset": true, 00:26:08.666 "nvme_admin": false, 00:26:08.666 "nvme_io": false, 00:26:08.666 "nvme_io_md": false, 00:26:08.666 "write_zeroes": true, 00:26:08.666 "zcopy": true, 00:26:08.666 "get_zone_info": false, 00:26:08.666 "zone_management": false, 00:26:08.666 "zone_append": false, 00:26:08.666 "compare": false, 00:26:08.666 "compare_and_write": false, 00:26:08.666 "abort": true, 00:26:08.666 "seek_hole": false, 00:26:08.666 "seek_data": false, 00:26:08.666 "copy": true, 00:26:08.666 "nvme_iov_md": false 00:26:08.666 }, 00:26:08.666 "memory_domains": [ 00:26:08.666 { 00:26:08.666 "dma_device_id": "system", 00:26:08.666 "dma_device_type": 1 00:26:08.666 }, 00:26:08.666 { 00:26:08.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:08.666 "dma_device_type": 2 00:26:08.666 } 00:26:08.666 ], 00:26:08.666 "driver_specific": { 00:26:08.666 "passthru": { 00:26:08.666 "name": "pt2", 00:26:08.666 "base_bdev_name": "malloc2" 00:26:08.666 } 00:26:08.666 } 00:26:08.666 }' 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:08.666 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:08.924 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:08.925 08:01:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:08.925 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:09.183 [2024-07-15 08:01:53.848769] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:09.183 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=cb4c008c-2207-4bb3-b10d-4c385c8ea58f 00:26:09.183 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z cb4c008c-2207-4bb3-b10d-4c385c8ea58f ']' 00:26:09.183 08:01:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:09.443 [2024-07-15 08:01:54.041051] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:09.443 [2024-07-15 08:01:54.041065] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:09.443 [2024-07-15 08:01:54.041103] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:09.443 [2024-07-15 08:01:54.041141] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:09.443 [2024-07-15 08:01:54.041147] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f5e480 name raid_bdev1, state offline 00:26:09.443 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:09.443 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.702 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:09.702 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:09.702 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:09.702 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:09.702 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:09.702 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:09.962 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:09.962 08:01:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:10.223 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:10.483 [2024-07-15 08:01:54.979389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:10.483 [2024-07-15 08:01:54.980474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:10.483 [2024-07-15 08:01:54.980516] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:10.483 [2024-07-15 08:01:54.980542] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:10.483 [2024-07-15 08:01:54.980553] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:10.483 [2024-07-15 08:01:54.980558] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f5b9c0 name raid_bdev1, state configuring 00:26:10.483 request: 00:26:10.483 { 00:26:10.483 "name": "raid_bdev1", 00:26:10.483 "raid_level": "raid1", 00:26:10.483 "base_bdevs": [ 00:26:10.483 "malloc1", 00:26:10.483 "malloc2" 00:26:10.483 ], 00:26:10.483 "superblock": false, 00:26:10.483 "method": 
"bdev_raid_create", 00:26:10.483 "req_id": 1 00:26:10.483 } 00:26:10.483 Got JSON-RPC error response 00:26:10.483 response: 00:26:10.483 { 00:26:10.483 "code": -17, 00:26:10.483 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:10.483 } 00:26:10.483 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:26:10.483 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:10.483 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:10.483 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:10.483 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.483 08:01:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:10.483 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:10.483 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:10.483 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:10.742 [2024-07-15 08:01:55.352288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:10.742 [2024-07-15 08:01:55.352313] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:10.742 [2024-07-15 08:01:55.352326] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dcd360 00:26:10.742 [2024-07-15 08:01:55.352333] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:10.742 [2024-07-15 08:01:55.353436] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:10.743 [2024-07-15 08:01:55.353454] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:10.743 [2024-07-15 08:01:55.353483] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:10.743 [2024-07-15 08:01:55.353499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:10.743 pt1 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.743 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.002 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.002 "name": "raid_bdev1", 00:26:11.002 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:11.002 "strip_size_kb": 0, 00:26:11.002 "state": "configuring", 00:26:11.002 "raid_level": "raid1", 00:26:11.002 "superblock": true, 00:26:11.002 "num_base_bdevs": 2, 00:26:11.002 "num_base_bdevs_discovered": 1, 00:26:11.002 "num_base_bdevs_operational": 2, 00:26:11.002 "base_bdevs_list": [ 00:26:11.002 { 00:26:11.002 "name": "pt1", 00:26:11.002 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:11.002 "is_configured": true, 00:26:11.002 "data_offset": 256, 00:26:11.002 "data_size": 7936 00:26:11.002 }, 00:26:11.002 { 00:26:11.002 "name": null, 00:26:11.002 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:11.002 "is_configured": false, 00:26:11.002 "data_offset": 256, 00:26:11.002 "data_size": 7936 00:26:11.002 } 00:26:11.002 ] 00:26:11.002 }' 00:26:11.002 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.002 08:01:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:11.571 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:11.571 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:11.571 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:11.571 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:11.571 [2024-07-15 08:01:56.286655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:11.571 [2024-07-15 08:01:56.286681] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:11.571 [2024-07-15 08:01:56.286691] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f615f0 00:26:11.571 [2024-07-15 08:01:56.286698] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:11.571 [2024-07-15 08:01:56.286810] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:11.572 [2024-07-15 08:01:56.286819] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:11.572 [2024-07-15 08:01:56.286843] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:11.572 [2024-07-15 08:01:56.286854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:11.572 [2024-07-15 08:01:56.286915] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f5fea0 00:26:11.572 [2024-07-15 08:01:56.286921] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:11.572 [2024-07-15 08:01:56.286958] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f60ec0 00:26:11.572 [2024-07-15 08:01:56.287015] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f5fea0 00:26:11.572 [2024-07-15 08:01:56.287021] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f5fea0 00:26:11.572 [2024-07-15 08:01:56.287062] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.572 pt2 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.572 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.832 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.832 "name": "raid_bdev1", 00:26:11.832 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:11.832 "strip_size_kb": 0, 00:26:11.832 "state": "online", 00:26:11.832 "raid_level": "raid1", 00:26:11.832 "superblock": true, 00:26:11.832 "num_base_bdevs": 2, 00:26:11.832 "num_base_bdevs_discovered": 2, 00:26:11.832 "num_base_bdevs_operational": 2, 00:26:11.832 "base_bdevs_list": [ 00:26:11.832 { 00:26:11.832 "name": "pt1", 00:26:11.832 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:11.832 "is_configured": true, 00:26:11.832 "data_offset": 256, 00:26:11.832 "data_size": 7936 00:26:11.832 }, 00:26:11.832 { 00:26:11.832 "name": "pt2", 00:26:11.832 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:11.832 "is_configured": true, 00:26:11.832 "data_offset": 256, 00:26:11.832 "data_size": 7936 00:26:11.832 } 00:26:11.832 ] 00:26:11.832 }' 00:26:11.832 08:01:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.832 08:01:56 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:12.402 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:12.661 [2024-07-15 08:01:57.237261] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:12.661 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:12.661 "name": "raid_bdev1", 00:26:12.661 "aliases": [ 00:26:12.661 "cb4c008c-2207-4bb3-b10d-4c385c8ea58f" 00:26:12.661 ], 00:26:12.661 "product_name": "Raid Volume", 00:26:12.661 "block_size": 4128, 00:26:12.661 "num_blocks": 7936, 00:26:12.661 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:12.661 "md_size": 32, 00:26:12.661 "md_interleave": true, 00:26:12.661 "dif_type": 0, 00:26:12.661 "assigned_rate_limits": { 00:26:12.661 "rw_ios_per_sec": 0, 00:26:12.661 "rw_mbytes_per_sec": 0, 00:26:12.661 "r_mbytes_per_sec": 0, 00:26:12.661 "w_mbytes_per_sec": 0 00:26:12.661 }, 00:26:12.661 "claimed": false, 00:26:12.661 "zoned": false, 00:26:12.661 "supported_io_types": { 00:26:12.661 "read": true, 00:26:12.661 "write": true, 00:26:12.661 "unmap": false, 00:26:12.661 "flush": false, 00:26:12.661 "reset": true, 00:26:12.661 "nvme_admin": false, 00:26:12.661 "nvme_io": false, 00:26:12.661 "nvme_io_md": false, 00:26:12.661 "write_zeroes": true, 00:26:12.661 "zcopy": false, 00:26:12.661 "get_zone_info": false, 00:26:12.661 "zone_management": false, 00:26:12.661 "zone_append": false, 00:26:12.661 "compare": false, 00:26:12.661 "compare_and_write": false, 00:26:12.661 "abort": false, 00:26:12.661 "seek_hole": false, 00:26:12.661 "seek_data": false, 00:26:12.661 "copy": false, 00:26:12.661 "nvme_iov_md": false 00:26:12.661 }, 00:26:12.661 "memory_domains": [ 00:26:12.661 { 00:26:12.661 "dma_device_id": "system", 00:26:12.661 "dma_device_type": 1 00:26:12.661 }, 00:26:12.661 { 00:26:12.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.661 "dma_device_type": 2 00:26:12.661 }, 00:26:12.661 { 00:26:12.661 "dma_device_id": "system", 00:26:12.661 "dma_device_type": 1 00:26:12.661 }, 00:26:12.661 { 00:26:12.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.661 "dma_device_type": 2 00:26:12.661 } 00:26:12.661 ], 00:26:12.661 "driver_specific": { 00:26:12.662 "raid": { 00:26:12.662 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:12.662 "strip_size_kb": 0, 00:26:12.662 "state": "online", 00:26:12.662 "raid_level": "raid1", 00:26:12.662 "superblock": true, 00:26:12.662 "num_base_bdevs": 2, 00:26:12.662 
"num_base_bdevs_discovered": 2, 00:26:12.662 "num_base_bdevs_operational": 2, 00:26:12.662 "base_bdevs_list": [ 00:26:12.662 { 00:26:12.662 "name": "pt1", 00:26:12.662 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:12.662 "is_configured": true, 00:26:12.662 "data_offset": 256, 00:26:12.662 "data_size": 7936 00:26:12.662 }, 00:26:12.662 { 00:26:12.662 "name": "pt2", 00:26:12.662 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:12.662 "is_configured": true, 00:26:12.662 "data_offset": 256, 00:26:12.662 "data_size": 7936 00:26:12.662 } 00:26:12.662 ] 00:26:12.662 } 00:26:12.662 } 00:26:12.662 }' 00:26:12.662 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:12.662 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:12.662 pt2' 00:26:12.662 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:12.662 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:12.662 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:12.920 "name": "pt1", 00:26:12.920 "aliases": [ 00:26:12.920 "00000000-0000-0000-0000-000000000001" 00:26:12.920 ], 00:26:12.920 "product_name": "passthru", 00:26:12.920 "block_size": 4128, 00:26:12.920 "num_blocks": 8192, 00:26:12.920 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:12.920 "md_size": 32, 00:26:12.920 "md_interleave": true, 00:26:12.920 "dif_type": 0, 00:26:12.920 "assigned_rate_limits": { 00:26:12.920 "rw_ios_per_sec": 0, 00:26:12.920 "rw_mbytes_per_sec": 0, 00:26:12.920 "r_mbytes_per_sec": 0, 00:26:12.920 "w_mbytes_per_sec": 0 00:26:12.920 }, 00:26:12.920 "claimed": true, 00:26:12.920 "claim_type": "exclusive_write", 00:26:12.920 "zoned": false, 00:26:12.920 "supported_io_types": { 00:26:12.920 "read": true, 00:26:12.920 "write": true, 00:26:12.920 "unmap": true, 00:26:12.920 "flush": true, 00:26:12.920 "reset": true, 00:26:12.920 "nvme_admin": false, 00:26:12.920 "nvme_io": false, 00:26:12.920 "nvme_io_md": false, 00:26:12.920 "write_zeroes": true, 00:26:12.920 "zcopy": true, 00:26:12.920 "get_zone_info": false, 00:26:12.920 "zone_management": false, 00:26:12.920 "zone_append": false, 00:26:12.920 "compare": false, 00:26:12.920 "compare_and_write": false, 00:26:12.920 "abort": true, 00:26:12.920 "seek_hole": false, 00:26:12.920 "seek_data": false, 00:26:12.920 "copy": true, 00:26:12.920 "nvme_iov_md": false 00:26:12.920 }, 00:26:12.920 "memory_domains": [ 00:26:12.920 { 00:26:12.920 "dma_device_id": "system", 00:26:12.920 "dma_device_type": 1 00:26:12.920 }, 00:26:12.920 { 00:26:12.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.920 "dma_device_type": 2 00:26:12.920 } 00:26:12.920 ], 00:26:12.920 "driver_specific": { 00:26:12.920 "passthru": { 00:26:12.920 "name": "pt1", 00:26:12.920 "base_bdev_name": "malloc1" 00:26:12.920 } 00:26:12.920 } 00:26:12.920 }' 00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:12.920 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:13.179 08:01:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:13.438 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:13.438 "name": "pt2", 00:26:13.438 "aliases": [ 00:26:13.438 "00000000-0000-0000-0000-000000000002" 00:26:13.438 ], 00:26:13.438 "product_name": "passthru", 00:26:13.438 "block_size": 4128, 00:26:13.438 "num_blocks": 8192, 00:26:13.438 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:13.438 "md_size": 32, 00:26:13.438 "md_interleave": true, 00:26:13.438 "dif_type": 0, 00:26:13.438 "assigned_rate_limits": { 00:26:13.438 "rw_ios_per_sec": 0, 00:26:13.438 "rw_mbytes_per_sec": 0, 00:26:13.438 "r_mbytes_per_sec": 0, 00:26:13.438 "w_mbytes_per_sec": 0 00:26:13.438 }, 00:26:13.438 "claimed": true, 00:26:13.438 "claim_type": "exclusive_write", 00:26:13.438 "zoned": false, 00:26:13.438 "supported_io_types": { 00:26:13.438 "read": true, 00:26:13.438 "write": true, 00:26:13.438 "unmap": true, 00:26:13.438 "flush": true, 00:26:13.438 "reset": true, 00:26:13.438 "nvme_admin": false, 00:26:13.438 "nvme_io": false, 00:26:13.438 "nvme_io_md": false, 00:26:13.438 "write_zeroes": true, 00:26:13.438 "zcopy": true, 00:26:13.438 "get_zone_info": false, 00:26:13.438 "zone_management": false, 00:26:13.438 "zone_append": false, 00:26:13.438 "compare": false, 00:26:13.438 "compare_and_write": false, 00:26:13.438 "abort": true, 00:26:13.438 "seek_hole": false, 00:26:13.438 "seek_data": false, 00:26:13.438 "copy": true, 00:26:13.438 "nvme_iov_md": false 00:26:13.438 }, 00:26:13.438 "memory_domains": [ 00:26:13.438 { 00:26:13.438 "dma_device_id": "system", 00:26:13.438 "dma_device_type": 1 00:26:13.438 }, 00:26:13.438 { 00:26:13.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:13.438 "dma_device_type": 2 00:26:13.438 } 00:26:13.438 ], 00:26:13.438 "driver_specific": { 00:26:13.438 "passthru": { 00:26:13.438 "name": "pt2", 00:26:13.438 "base_bdev_name": "malloc2" 00:26:13.438 } 00:26:13.438 } 00:26:13.438 }' 00:26:13.438 08:01:58 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:13.438 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:13.438 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:26:13.438 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:13.438 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:13.698 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:13.957 [2024-07-15 08:01:58.556585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:13.957 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' cb4c008c-2207-4bb3-b10d-4c385c8ea58f '!=' cb4c008c-2207-4bb3-b10d-4c385c8ea58f ']' 00:26:13.957 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:13.957 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:13.957 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:26:13.957 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:14.216 [2024-07-15 08:01:58.748893] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:14.216 08:01:58 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:14.216 "name": "raid_bdev1", 00:26:14.216 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:14.216 "strip_size_kb": 0, 00:26:14.216 "state": "online", 00:26:14.216 "raid_level": "raid1", 00:26:14.216 "superblock": true, 00:26:14.216 "num_base_bdevs": 2, 00:26:14.216 "num_base_bdevs_discovered": 1, 00:26:14.216 "num_base_bdevs_operational": 1, 00:26:14.216 "base_bdevs_list": [ 00:26:14.216 { 00:26:14.216 "name": null, 00:26:14.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:14.216 "is_configured": false, 00:26:14.216 "data_offset": 256, 00:26:14.216 "data_size": 7936 00:26:14.216 }, 00:26:14.216 { 00:26:14.216 "name": "pt2", 00:26:14.216 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:14.216 "is_configured": true, 00:26:14.216 "data_offset": 256, 00:26:14.216 "data_size": 7936 00:26:14.216 } 00:26:14.216 ] 00:26:14.216 }' 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:14.216 08:01:58 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:14.786 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:15.064 [2024-07-15 08:01:59.667209] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:15.064 [2024-07-15 08:01:59.667224] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:15.064 [2024-07-15 08:01:59.667255] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:15.064 [2024-07-15 08:01:59.667286] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:15.064 [2024-07-15 08:01:59.667292] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f5fea0 name raid_bdev1, state offline 00:26:15.064 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.064 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:15.360 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:15.360 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:15.360 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:15.360 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
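[editor's note] A minimal sketch, not part of the captured trace, of the superblock re-assembly path exercised by the @505-@522 steps that follow: after raid_bdev1 and the pt1 passthru are gone, deleting and re-creating pt2 lets bdev examine() rebuild raid_bdev1 from the md-interleaved superblock stored on malloc2 alone, leaving the array online with a single discovered base bdev. Commands and UUID are the ones visible in this run.

    # hypothetical manual walk-through of the single-base-bdev re-assembly
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    $RPC bdev_passthru_delete pt2
    $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    # raid_bdev1 should come back online with 1 of 2 base bdevs discovered
    $RPC bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .state, .num_base_bdevs_discovered'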
00:26:15.360 08:01:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:15.360 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:15.360 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:15.360 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:15.360 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:15.360 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:26:15.360 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:15.619 [2024-07-15 08:02:00.248716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:15.619 [2024-07-15 08:02:00.248749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:15.619 [2024-07-15 08:02:00.248761] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dcd6d0 00:26:15.619 [2024-07-15 08:02:00.248767] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:15.619 [2024-07-15 08:02:00.249877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:15.619 [2024-07-15 08:02:00.249894] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:15.619 [2024-07-15 08:02:00.249924] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:15.619 [2024-07-15 08:02:00.249941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:15.619 [2024-07-15 08:02:00.249998] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f60fd0 00:26:15.619 [2024-07-15 08:02:00.250004] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:15.619 [2024-07-15 08:02:00.250042] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dcf970 00:26:15.619 [2024-07-15 08:02:00.250097] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f60fd0 00:26:15.619 [2024-07-15 08:02:00.250102] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f60fd0 00:26:15.619 [2024-07-15 08:02:00.250140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:15.619 pt2 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.619 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.879 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:15.879 "name": "raid_bdev1", 00:26:15.879 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:15.879 "strip_size_kb": 0, 00:26:15.879 "state": "online", 00:26:15.879 "raid_level": "raid1", 00:26:15.879 "superblock": true, 00:26:15.879 "num_base_bdevs": 2, 00:26:15.879 "num_base_bdevs_discovered": 1, 00:26:15.879 "num_base_bdevs_operational": 1, 00:26:15.879 "base_bdevs_list": [ 00:26:15.879 { 00:26:15.879 "name": null, 00:26:15.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.879 "is_configured": false, 00:26:15.879 "data_offset": 256, 00:26:15.879 "data_size": 7936 00:26:15.879 }, 00:26:15.879 { 00:26:15.879 "name": "pt2", 00:26:15.879 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:15.879 "is_configured": true, 00:26:15.879 "data_offset": 256, 00:26:15.879 "data_size": 7936 00:26:15.879 } 00:26:15.879 ] 00:26:15.879 }' 00:26:15.879 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:15.879 08:02:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:16.463 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:16.463 [2024-07-15 08:02:01.203129] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:16.463 [2024-07-15 08:02:01.203144] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:16.463 [2024-07-15 08:02:01.203175] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:16.463 [2024-07-15 08:02:01.203202] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:16.463 [2024-07-15 08:02:01.203208] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f60fd0 name raid_bdev1, state offline 00:26:16.723 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.723 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:16.723 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:16.723 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:16.723 08:02:01 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:16.723 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:16.982 [2024-07-15 08:02:01.588096] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:16.982 [2024-07-15 08:02:01.588121] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.982 [2024-07-15 08:02:01.588130] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5d530 00:26:16.982 [2024-07-15 08:02:01.588136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.982 [2024-07-15 08:02:01.589245] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.983 [2024-07-15 08:02:01.589263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:16.983 [2024-07-15 08:02:01.589294] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:16.983 [2024-07-15 08:02:01.589311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:16.983 [2024-07-15 08:02:01.589370] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:16.983 [2024-07-15 08:02:01.589377] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:16.983 [2024-07-15 08:02:01.589385] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f60c20 name raid_bdev1, state configuring 00:26:16.983 [2024-07-15 08:02:01.589399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:16.983 [2024-07-15 08:02:01.589437] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f61820 00:26:16.983 [2024-07-15 08:02:01.589443] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:16.983 [2024-07-15 08:02:01.589479] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dcefa0 00:26:16.983 [2024-07-15 08:02:01.589533] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f61820 00:26:16.983 [2024-07-15 08:02:01.589538] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f61820 00:26:16.983 [2024-07-15 08:02:01.589584] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:16.983 pt1 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:16.983 08:02:01 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.983 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.242 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.242 "name": "raid_bdev1", 00:26:17.242 "uuid": "cb4c008c-2207-4bb3-b10d-4c385c8ea58f", 00:26:17.242 "strip_size_kb": 0, 00:26:17.242 "state": "online", 00:26:17.242 "raid_level": "raid1", 00:26:17.242 "superblock": true, 00:26:17.242 "num_base_bdevs": 2, 00:26:17.242 "num_base_bdevs_discovered": 1, 00:26:17.242 "num_base_bdevs_operational": 1, 00:26:17.242 "base_bdevs_list": [ 00:26:17.242 { 00:26:17.242 "name": null, 00:26:17.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.242 "is_configured": false, 00:26:17.242 "data_offset": 256, 00:26:17.242 "data_size": 7936 00:26:17.242 }, 00:26:17.242 { 00:26:17.242 "name": "pt2", 00:26:17.242 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:17.242 "is_configured": true, 00:26:17.242 "data_offset": 256, 00:26:17.242 "data_size": 7936 00:26:17.242 } 00:26:17.242 ] 00:26:17.242 }' 00:26:17.242 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.242 08:02:01 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:17.810 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:17.810 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:17.810 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:17.810 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:17.810 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:18.070 [2024-07-15 08:02:02.703098] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' cb4c008c-2207-4bb3-b10d-4c385c8ea58f '!=' cb4c008c-2207-4bb3-b10d-4c385c8ea58f ']' 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1762823 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1762823 ']' 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1762823 00:26:18.070 08:02:02 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1762823 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1762823' 00:26:18.070 killing process with pid 1762823 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 1762823 00:26:18.070 [2024-07-15 08:02:02.770551] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:18.070 [2024-07-15 08:02:02.770586] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:18.070 [2024-07-15 08:02:02.770615] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:18.070 [2024-07-15 08:02:02.770621] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f61820 name raid_bdev1, state offline 00:26:18.070 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 1762823 00:26:18.070 [2024-07-15 08:02:02.780022] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:18.331 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:26:18.331 00:26:18.331 real 0m13.065s 00:26:18.331 user 0m24.188s 00:26:18.331 sys 0m1.981s 00:26:18.331 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:18.331 08:02:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:18.331 ************************************ 00:26:18.331 END TEST raid_superblock_test_md_interleaved 00:26:18.331 ************************************ 00:26:18.331 08:02:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:18.331 08:02:02 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:26:18.331 08:02:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:18.331 08:02:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:18.331 08:02:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:18.331 ************************************ 00:26:18.331 START TEST raid_rebuild_test_sb_md_interleaved 00:26:18.331 ************************************ 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:18.331 
08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1765328 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1765328 /var/tmp/spdk-raid.sock 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1765328 ']' 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:26:18.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:18.331 08:02:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:18.331 [2024-07-15 08:02:03.040635] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:18.331 [2024-07-15 08:02:03.040684] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1765328 ] 00:26:18.331 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:18.331 Zero copy mechanism will not be used. 00:26:18.591 [2024-07-15 08:02:03.131099] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:18.591 [2024-07-15 08:02:03.197737] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:18.591 [2024-07-15 08:02:03.238474] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:18.591 [2024-07-15 08:02:03.238495] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:19.160 08:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:19.160 08:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:26:19.160 08:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:19.160 08:02:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:26:19.420 BaseBdev1_malloc 00:26:19.420 08:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:19.680 [2024-07-15 08:02:04.276458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:19.680 [2024-07-15 08:02:04.276491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.680 [2024-07-15 08:02:04.276504] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2786680 00:26:19.680 [2024-07-15 08:02:04.276511] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.680 [2024-07-15 08:02:04.277666] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.680 [2024-07-15 08:02:04.277684] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:19.680 BaseBdev1 00:26:19.680 08:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:19.680 08:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:26:19.941 BaseBdev2_malloc 00:26:19.941 08:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:19.941 [2024-07-15 08:02:04.659717] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:19.941 [2024-07-15 08:02:04.659744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:19.941 [2024-07-15 08:02:04.659756] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2906d30 00:26:19.941 [2024-07-15 08:02:04.659767] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:19.941 [2024-07-15 08:02:04.660876] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:19.941 [2024-07-15 08:02:04.660893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:19.941 BaseBdev2 00:26:19.941 08:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:26:20.201 spare_malloc 00:26:20.201 08:02:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:20.460 spare_delay 00:26:20.460 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:20.720 [2024-07-15 08:02:05.231365] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:20.721 [2024-07-15 08:02:05.231395] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.721 [2024-07-15 08:02:05.231408] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2907600 00:26:20.721 [2024-07-15 08:02:05.231415] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.721 [2024-07-15 08:02:05.232482] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.721 [2024-07-15 08:02:05.232501] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:20.721 spare 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:20.721 [2024-07-15 08:02:05.407829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:20.721 [2024-07-15 08:02:05.408817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:20.721 [2024-07-15 08:02:05.408933] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2911e50 00:26:20.721 [2024-07-15 08:02:05.408941] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:20.721 [2024-07-15 08:02:05.408988] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2787110 00:26:20.721 [2024-07-15 08:02:05.409055] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2911e50 00:26:20.721 [2024-07-15 08:02:05.409061] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2911e50 00:26:20.721 [2024-07-15 08:02:05.409100] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.721 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.981 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:20.981 "name": "raid_bdev1", 00:26:20.981 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:20.981 "strip_size_kb": 0, 00:26:20.981 "state": "online", 00:26:20.981 "raid_level": "raid1", 00:26:20.981 "superblock": true, 00:26:20.981 "num_base_bdevs": 2, 00:26:20.981 "num_base_bdevs_discovered": 2, 00:26:20.981 "num_base_bdevs_operational": 2, 00:26:20.981 "base_bdevs_list": [ 00:26:20.981 { 00:26:20.981 "name": "BaseBdev1", 00:26:20.981 "uuid": "f095add3-51d5-5684-a4d5-0d6f9b2374d9", 00:26:20.981 "is_configured": true, 00:26:20.981 "data_offset": 256, 00:26:20.981 "data_size": 7936 00:26:20.981 }, 00:26:20.981 { 00:26:20.981 "name": "BaseBdev2", 00:26:20.981 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:20.981 "is_configured": true, 00:26:20.981 "data_offset": 256, 00:26:20.981 "data_size": 7936 00:26:20.981 } 00:26:20.981 ] 00:26:20.981 }' 00:26:20.981 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:20.981 08:02:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:21.552 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:21.552 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:21.811 [2024-07-15 08:02:06.310286] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:26:21.811 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:22.071 [2024-07-15 08:02:06.695045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.071 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.332 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.332 "name": "raid_bdev1", 00:26:22.332 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:22.332 "strip_size_kb": 0, 00:26:22.332 "state": "online", 00:26:22.332 "raid_level": "raid1", 00:26:22.332 "superblock": true, 00:26:22.332 "num_base_bdevs": 2, 00:26:22.332 "num_base_bdevs_discovered": 1, 00:26:22.332 "num_base_bdevs_operational": 1, 00:26:22.332 "base_bdevs_list": [ 00:26:22.332 { 00:26:22.332 "name": null, 00:26:22.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.332 "is_configured": false, 00:26:22.332 "data_offset": 256, 00:26:22.332 "data_size": 7936 00:26:22.332 }, 00:26:22.332 { 00:26:22.332 "name": "BaseBdev2", 00:26:22.332 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:22.332 "is_configured": true, 00:26:22.332 "data_offset": 256, 00:26:22.332 "data_size": 7936 00:26:22.332 } 00:26:22.332 ] 00:26:22.332 }' 00:26:22.332 08:02:06 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.332 08:02:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:22.900 08:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:22.900 [2024-07-15 08:02:07.633508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:22.900 [2024-07-15 08:02:07.635991] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277e980 00:26:22.900 [2024-07-15 08:02:07.637512] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:22.900 08:02:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.281 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:24.281 "name": "raid_bdev1", 00:26:24.281 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:24.281 "strip_size_kb": 0, 00:26:24.281 "state": "online", 00:26:24.281 "raid_level": "raid1", 00:26:24.281 "superblock": true, 00:26:24.281 "num_base_bdevs": 2, 00:26:24.281 "num_base_bdevs_discovered": 2, 00:26:24.281 "num_base_bdevs_operational": 2, 00:26:24.281 "process": { 00:26:24.281 "type": "rebuild", 00:26:24.281 "target": "spare", 00:26:24.281 "progress": { 00:26:24.281 "blocks": 2816, 00:26:24.281 "percent": 35 00:26:24.281 } 00:26:24.281 }, 00:26:24.281 "base_bdevs_list": [ 00:26:24.281 { 00:26:24.281 "name": "spare", 00:26:24.281 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:24.281 "is_configured": true, 00:26:24.281 "data_offset": 256, 00:26:24.281 "data_size": 7936 00:26:24.281 }, 00:26:24.281 { 00:26:24.281 "name": "BaseBdev2", 00:26:24.281 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:24.281 "is_configured": true, 00:26:24.281 "data_offset": 256, 00:26:24.281 "data_size": 7936 00:26:24.282 } 00:26:24.282 ] 00:26:24.282 }' 00:26:24.282 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:24.282 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:24.282 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:24.282 08:02:08 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:24.282 08:02:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:24.542 [2024-07-15 08:02:09.074121] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:24.542 [2024-07-15 08:02:09.146306] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:24.542 [2024-07-15 08:02:09.146339] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:24.542 [2024-07-15 08:02:09.146348] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:24.542 [2024-07-15 08:02:09.146353] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.542 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.801 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.801 "name": "raid_bdev1", 00:26:24.801 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:24.801 "strip_size_kb": 0, 00:26:24.801 "state": "online", 00:26:24.801 "raid_level": "raid1", 00:26:24.801 "superblock": true, 00:26:24.801 "num_base_bdevs": 2, 00:26:24.801 "num_base_bdevs_discovered": 1, 00:26:24.801 "num_base_bdevs_operational": 1, 00:26:24.801 "base_bdevs_list": [ 00:26:24.801 { 00:26:24.801 "name": null, 00:26:24.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.801 "is_configured": false, 00:26:24.801 "data_offset": 256, 00:26:24.801 "data_size": 7936 00:26:24.801 }, 00:26:24.801 { 00:26:24.801 "name": "BaseBdev2", 00:26:24.801 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:24.801 "is_configured": true, 00:26:24.801 "data_offset": 256, 00:26:24.801 "data_size": 7936 00:26:24.801 } 00:26:24.801 ] 00:26:24.801 }' 00:26:24.801 
08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.801 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.371 08:02:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.633 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.633 "name": "raid_bdev1", 00:26:25.633 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:25.633 "strip_size_kb": 0, 00:26:25.633 "state": "online", 00:26:25.633 "raid_level": "raid1", 00:26:25.633 "superblock": true, 00:26:25.633 "num_base_bdevs": 2, 00:26:25.633 "num_base_bdevs_discovered": 1, 00:26:25.633 "num_base_bdevs_operational": 1, 00:26:25.633 "base_bdevs_list": [ 00:26:25.633 { 00:26:25.633 "name": null, 00:26:25.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.633 "is_configured": false, 00:26:25.633 "data_offset": 256, 00:26:25.633 "data_size": 7936 00:26:25.633 }, 00:26:25.633 { 00:26:25.633 "name": "BaseBdev2", 00:26:25.633 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:25.633 "is_configured": true, 00:26:25.633 "data_offset": 256, 00:26:25.633 "data_size": 7936 00:26:25.633 } 00:26:25.633 ] 00:26:25.633 }' 00:26:25.633 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.633 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:25.633 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.633 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:25.633 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:25.892 [2024-07-15 08:02:10.413530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:25.892 [2024-07-15 08:02:10.415933] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277e760 00:26:25.892 [2024-07-15 08:02:10.417060] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:25.892 08:02:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.831 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.091 "name": "raid_bdev1", 00:26:27.091 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:27.091 "strip_size_kb": 0, 00:26:27.091 "state": "online", 00:26:27.091 "raid_level": "raid1", 00:26:27.091 "superblock": true, 00:26:27.091 "num_base_bdevs": 2, 00:26:27.091 "num_base_bdevs_discovered": 2, 00:26:27.091 "num_base_bdevs_operational": 2, 00:26:27.091 "process": { 00:26:27.091 "type": "rebuild", 00:26:27.091 "target": "spare", 00:26:27.091 "progress": { 00:26:27.091 "blocks": 2816, 00:26:27.091 "percent": 35 00:26:27.091 } 00:26:27.091 }, 00:26:27.091 "base_bdevs_list": [ 00:26:27.091 { 00:26:27.091 "name": "spare", 00:26:27.091 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:27.091 "is_configured": true, 00:26:27.091 "data_offset": 256, 00:26:27.091 "data_size": 7936 00:26:27.091 }, 00:26:27.091 { 00:26:27.091 "name": "BaseBdev2", 00:26:27.091 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:27.091 "is_configured": true, 00:26:27.091 "data_offset": 256, 00:26:27.091 "data_size": 7936 00:26:27.091 } 00:26:27.091 ] 00:26:27.091 }' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:27.091 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=997 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:27.091 08:02:11 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.091 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:27.092 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:27.092 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.092 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.092 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.352 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.352 "name": "raid_bdev1", 00:26:27.352 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:27.352 "strip_size_kb": 0, 00:26:27.352 "state": "online", 00:26:27.352 "raid_level": "raid1", 00:26:27.352 "superblock": true, 00:26:27.352 "num_base_bdevs": 2, 00:26:27.352 "num_base_bdevs_discovered": 2, 00:26:27.352 "num_base_bdevs_operational": 2, 00:26:27.352 "process": { 00:26:27.352 "type": "rebuild", 00:26:27.352 "target": "spare", 00:26:27.352 "progress": { 00:26:27.352 "blocks": 3584, 00:26:27.352 "percent": 45 00:26:27.352 } 00:26:27.352 }, 00:26:27.352 "base_bdevs_list": [ 00:26:27.352 { 00:26:27.352 "name": "spare", 00:26:27.352 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:27.352 "is_configured": true, 00:26:27.352 "data_offset": 256, 00:26:27.352 "data_size": 7936 00:26:27.352 }, 00:26:27.352 { 00:26:27.352 "name": "BaseBdev2", 00:26:27.352 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:27.352 "is_configured": true, 00:26:27.352 "data_offset": 256, 00:26:27.352 "data_size": 7936 00:26:27.352 } 00:26:27.352 ] 00:26:27.352 }' 00:26:27.352 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.352 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:27.352 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.352 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:27.352 08:02:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:28.291 08:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:28.291 08:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:28.291 08:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.291 08:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:28.291 08:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:28.291 08:02:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.291 
08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.291 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.551 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.551 "name": "raid_bdev1", 00:26:28.551 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:28.551 "strip_size_kb": 0, 00:26:28.551 "state": "online", 00:26:28.551 "raid_level": "raid1", 00:26:28.551 "superblock": true, 00:26:28.551 "num_base_bdevs": 2, 00:26:28.551 "num_base_bdevs_discovered": 2, 00:26:28.551 "num_base_bdevs_operational": 2, 00:26:28.551 "process": { 00:26:28.551 "type": "rebuild", 00:26:28.551 "target": "spare", 00:26:28.551 "progress": { 00:26:28.551 "blocks": 6912, 00:26:28.551 "percent": 87 00:26:28.551 } 00:26:28.551 }, 00:26:28.551 "base_bdevs_list": [ 00:26:28.551 { 00:26:28.551 "name": "spare", 00:26:28.551 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:28.551 "is_configured": true, 00:26:28.551 "data_offset": 256, 00:26:28.551 "data_size": 7936 00:26:28.551 }, 00:26:28.551 { 00:26:28.551 "name": "BaseBdev2", 00:26:28.551 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:28.552 "is_configured": true, 00:26:28.552 "data_offset": 256, 00:26:28.552 "data_size": 7936 00:26:28.552 } 00:26:28.552 ] 00:26:28.552 }' 00:26:28.552 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.552 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:28.552 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.552 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:28.552 08:02:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:28.810 [2024-07-15 08:02:13.535227] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:28.810 [2024-07-15 08:02:13.535270] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:28.810 [2024-07-15 08:02:13.535332] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.750 08:02:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.750 "name": "raid_bdev1", 00:26:29.750 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:29.750 "strip_size_kb": 0, 00:26:29.750 "state": "online", 00:26:29.750 "raid_level": "raid1", 00:26:29.750 "superblock": true, 00:26:29.750 "num_base_bdevs": 2, 00:26:29.750 "num_base_bdevs_discovered": 2, 00:26:29.750 "num_base_bdevs_operational": 2, 00:26:29.750 "base_bdevs_list": [ 00:26:29.750 { 00:26:29.750 "name": "spare", 00:26:29.750 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:29.750 "is_configured": true, 00:26:29.750 "data_offset": 256, 00:26:29.750 "data_size": 7936 00:26:29.750 }, 00:26:29.750 { 00:26:29.750 "name": "BaseBdev2", 00:26:29.750 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:29.750 "is_configured": true, 00:26:29.750 "data_offset": 256, 00:26:29.750 "data_size": 7936 00:26:29.750 } 00:26:29.750 ] 00:26:29.750 }' 00:26:29.750 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.010 "name": "raid_bdev1", 00:26:30.010 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:30.010 "strip_size_kb": 0, 00:26:30.010 "state": "online", 00:26:30.010 "raid_level": "raid1", 00:26:30.010 "superblock": true, 00:26:30.010 "num_base_bdevs": 2, 00:26:30.010 "num_base_bdevs_discovered": 2, 00:26:30.010 "num_base_bdevs_operational": 2, 00:26:30.010 "base_bdevs_list": [ 00:26:30.010 { 00:26:30.010 "name": "spare", 00:26:30.010 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:30.010 "is_configured": true, 00:26:30.010 "data_offset": 256, 00:26:30.010 "data_size": 7936 00:26:30.010 }, 00:26:30.010 { 00:26:30.010 "name": "BaseBdev2", 00:26:30.010 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:30.010 
"is_configured": true, 00:26:30.010 "data_offset": 256, 00:26:30.010 "data_size": 7936 00:26:30.010 } 00:26:30.010 ] 00:26:30.010 }' 00:26:30.010 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.269 08:02:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.528 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:30.528 "name": "raid_bdev1", 00:26:30.528 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:30.528 "strip_size_kb": 0, 00:26:30.528 "state": "online", 00:26:30.528 "raid_level": "raid1", 00:26:30.528 "superblock": true, 00:26:30.528 "num_base_bdevs": 2, 00:26:30.528 "num_base_bdevs_discovered": 2, 00:26:30.528 "num_base_bdevs_operational": 2, 00:26:30.528 "base_bdevs_list": [ 00:26:30.528 { 00:26:30.528 "name": "spare", 00:26:30.528 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:30.528 "is_configured": true, 00:26:30.528 "data_offset": 256, 00:26:30.528 "data_size": 7936 00:26:30.528 }, 00:26:30.528 { 00:26:30.528 "name": "BaseBdev2", 00:26:30.528 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:30.528 "is_configured": true, 00:26:30.528 "data_offset": 256, 00:26:30.528 "data_size": 7936 00:26:30.528 } 00:26:30.528 ] 00:26:30.528 }' 00:26:30.528 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:30.528 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:31.097 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:31.097 [2024-07-15 08:02:15.743254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:31.097 [2024-07-15 08:02:15.743271] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:31.097 [2024-07-15 08:02:15.743310] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:31.097 [2024-07-15 08:02:15.743349] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:31.097 [2024-07-15 08:02:15.743355] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2911e50 name raid_bdev1, state offline 00:26:31.097 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.097 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:26:31.357 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:31.357 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:26:31.357 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:31.357 08:02:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:31.616 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:31.616 [2024-07-15 08:02:16.316670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:31.616 [2024-07-15 08:02:16.316696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:31.616 [2024-07-15 08:02:16.316707] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2911bc0 00:26:31.616 [2024-07-15 08:02:16.316717] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:31.616 [2024-07-15 08:02:16.318070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:31.616 [2024-07-15 08:02:16.318090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:31.616 [2024-07-15 08:02:16.318130] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:31.616 [2024-07-15 08:02:16.318149] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:31.616 [2024-07-15 08:02:16.318212] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:31.616 spare 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.617 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.877 [2024-07-15 08:02:16.418496] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x29139f0 00:26:31.877 [2024-07-15 08:02:16.418504] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:26:31.877 [2024-07-15 08:02:16.418553] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277fae0 00:26:31.877 [2024-07-15 08:02:16.418618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x29139f0 00:26:31.877 [2024-07-15 08:02:16.418623] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x29139f0 00:26:31.877 [2024-07-15 08:02:16.418666] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:31.877 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.877 "name": "raid_bdev1", 00:26:31.877 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:31.877 "strip_size_kb": 0, 00:26:31.877 "state": "online", 00:26:31.877 "raid_level": "raid1", 00:26:31.877 "superblock": true, 00:26:31.877 "num_base_bdevs": 2, 00:26:31.877 "num_base_bdevs_discovered": 2, 00:26:31.877 "num_base_bdevs_operational": 2, 00:26:31.877 "base_bdevs_list": [ 00:26:31.877 { 00:26:31.877 "name": "spare", 00:26:31.877 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:31.877 "is_configured": true, 00:26:31.877 "data_offset": 256, 00:26:31.877 "data_size": 7936 00:26:31.877 }, 00:26:31.877 { 00:26:31.877 "name": "BaseBdev2", 00:26:31.877 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:31.877 "is_configured": true, 00:26:31.877 "data_offset": 256, 00:26:31.877 "data_size": 7936 00:26:31.877 } 00:26:31.877 ] 00:26:31.877 }' 00:26:31.877 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.877 08:02:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.490 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.754 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:32.754 "name": "raid_bdev1", 00:26:32.754 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:32.754 "strip_size_kb": 0, 00:26:32.754 "state": "online", 00:26:32.754 "raid_level": "raid1", 00:26:32.754 "superblock": true, 00:26:32.754 "num_base_bdevs": 2, 00:26:32.754 "num_base_bdevs_discovered": 2, 00:26:32.754 "num_base_bdevs_operational": 2, 00:26:32.754 "base_bdevs_list": [ 00:26:32.754 { 00:26:32.754 "name": "spare", 00:26:32.754 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:32.754 "is_configured": true, 00:26:32.754 "data_offset": 256, 00:26:32.754 "data_size": 7936 00:26:32.754 }, 00:26:32.754 { 00:26:32.754 "name": "BaseBdev2", 00:26:32.754 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:32.754 "is_configured": true, 00:26:32.754 "data_offset": 256, 00:26:32.754 "data_size": 7936 00:26:32.754 } 00:26:32.754 ] 00:26:32.754 }' 00:26:32.754 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:32.754 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:32.755 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:32.755 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:32.755 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.755 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:33.014 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:33.015 [2024-07-15 08:02:17.736342] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:33.015 
08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.015 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.278 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:33.278 "name": "raid_bdev1", 00:26:33.278 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:33.278 "strip_size_kb": 0, 00:26:33.278 "state": "online", 00:26:33.278 "raid_level": "raid1", 00:26:33.278 "superblock": true, 00:26:33.278 "num_base_bdevs": 2, 00:26:33.278 "num_base_bdevs_discovered": 1, 00:26:33.278 "num_base_bdevs_operational": 1, 00:26:33.278 "base_bdevs_list": [ 00:26:33.278 { 00:26:33.278 "name": null, 00:26:33.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.278 "is_configured": false, 00:26:33.278 "data_offset": 256, 00:26:33.278 "data_size": 7936 00:26:33.278 }, 00:26:33.278 { 00:26:33.278 "name": "BaseBdev2", 00:26:33.278 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:33.278 "is_configured": true, 00:26:33.278 "data_offset": 256, 00:26:33.278 "data_size": 7936 00:26:33.278 } 00:26:33.278 ] 00:26:33.278 }' 00:26:33.278 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:33.278 08:02:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:33.847 08:02:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:34.107 [2024-07-15 08:02:18.678753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:34.107 [2024-07-15 08:02:18.678858] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:34.107 [2024-07-15 08:02:18.678867] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:34.107 [2024-07-15 08:02:18.678884] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:34.107 [2024-07-15 08:02:18.681316] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27855e0 00:26:34.107 [2024-07-15 08:02:18.682852] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:34.107 08:02:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.047 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.306 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:35.306 "name": "raid_bdev1", 00:26:35.306 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:35.306 "strip_size_kb": 0, 00:26:35.306 "state": "online", 00:26:35.306 "raid_level": "raid1", 00:26:35.306 "superblock": true, 00:26:35.306 "num_base_bdevs": 2, 00:26:35.306 "num_base_bdevs_discovered": 2, 00:26:35.306 "num_base_bdevs_operational": 2, 00:26:35.306 "process": { 00:26:35.306 "type": "rebuild", 00:26:35.306 "target": "spare", 00:26:35.307 "progress": { 00:26:35.307 "blocks": 2816, 00:26:35.307 "percent": 35 00:26:35.307 } 00:26:35.307 }, 00:26:35.307 "base_bdevs_list": [ 00:26:35.307 { 00:26:35.307 "name": "spare", 00:26:35.307 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:35.307 "is_configured": true, 00:26:35.307 "data_offset": 256, 00:26:35.307 "data_size": 7936 00:26:35.307 }, 00:26:35.307 { 00:26:35.307 "name": "BaseBdev2", 00:26:35.307 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:35.307 "is_configured": true, 00:26:35.307 "data_offset": 256, 00:26:35.307 "data_size": 7936 00:26:35.307 } 00:26:35.307 ] 00:26:35.307 }' 00:26:35.307 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:35.307 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:35.307 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:35.307 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:35.307 08:02:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:35.567 [2024-07-15 08:02:20.159403] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:35.567 [2024-07-15 08:02:20.191755] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:35.567 [2024-07-15 08:02:20.191785] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:35.567 [2024-07-15 08:02:20.191795] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:35.567 [2024-07-15 08:02:20.191799] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.567 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.826 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:35.826 "name": "raid_bdev1", 00:26:35.826 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:35.826 "strip_size_kb": 0, 00:26:35.826 "state": "online", 00:26:35.826 "raid_level": "raid1", 00:26:35.826 "superblock": true, 00:26:35.826 "num_base_bdevs": 2, 00:26:35.826 "num_base_bdevs_discovered": 1, 00:26:35.826 "num_base_bdevs_operational": 1, 00:26:35.826 "base_bdevs_list": [ 00:26:35.826 { 00:26:35.826 "name": null, 00:26:35.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.826 "is_configured": false, 00:26:35.826 "data_offset": 256, 00:26:35.826 "data_size": 7936 00:26:35.826 }, 00:26:35.826 { 00:26:35.826 "name": "BaseBdev2", 00:26:35.826 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:35.826 "is_configured": true, 00:26:35.826 "data_offset": 256, 00:26:35.826 "data_size": 7936 00:26:35.826 } 00:26:35.826 ] 00:26:35.826 }' 00:26:35.826 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:35.826 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:36.395 08:02:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:36.395 [2024-07-15 
08:02:21.105758] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:36.395 [2024-07-15 08:02:21.105789] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.395 [2024-07-15 08:02:21.105804] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277d440 00:26:36.395 [2024-07-15 08:02:21.105810] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.395 [2024-07-15 08:02:21.105949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.395 [2024-07-15 08:02:21.105958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:36.395 [2024-07-15 08:02:21.105996] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:36.395 [2024-07-15 08:02:21.106003] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:36.395 [2024-07-15 08:02:21.106008] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:36.395 [2024-07-15 08:02:21.106019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:36.395 [2024-07-15 08:02:21.108319] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277dce0 00:26:36.395 [2024-07-15 08:02:21.109453] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:36.395 spare 00:26:36.395 08:02:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:37.773 "name": "raid_bdev1", 00:26:37.773 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:37.773 "strip_size_kb": 0, 00:26:37.773 "state": "online", 00:26:37.773 "raid_level": "raid1", 00:26:37.773 "superblock": true, 00:26:37.773 "num_base_bdevs": 2, 00:26:37.773 "num_base_bdevs_discovered": 2, 00:26:37.773 "num_base_bdevs_operational": 2, 00:26:37.773 "process": { 00:26:37.773 "type": "rebuild", 00:26:37.773 "target": "spare", 00:26:37.773 "progress": { 00:26:37.773 "blocks": 2816, 00:26:37.773 "percent": 35 00:26:37.773 } 00:26:37.773 }, 00:26:37.773 "base_bdevs_list": [ 00:26:37.773 { 00:26:37.773 "name": "spare", 00:26:37.773 "uuid": "a915a23a-32c1-526c-a711-28c4465f3daf", 00:26:37.773 "is_configured": true, 00:26:37.773 "data_offset": 256, 00:26:37.773 
"data_size": 7936 00:26:37.773 }, 00:26:37.773 { 00:26:37.773 "name": "BaseBdev2", 00:26:37.773 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:37.773 "is_configured": true, 00:26:37.773 "data_offset": 256, 00:26:37.773 "data_size": 7936 00:26:37.773 } 00:26:37.773 ] 00:26:37.773 }' 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:37.773 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:38.032 [2024-07-15 08:02:22.590294] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:38.032 [2024-07-15 08:02:22.618324] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:38.032 [2024-07-15 08:02:22.618353] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:38.032 [2024-07-15 08:02:22.618362] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:38.032 [2024-07-15 08:02:22.618367] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.032 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.292 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.292 "name": "raid_bdev1", 00:26:38.292 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:38.292 "strip_size_kb": 0, 00:26:38.292 "state": "online", 00:26:38.292 
"raid_level": "raid1", 00:26:38.292 "superblock": true, 00:26:38.292 "num_base_bdevs": 2, 00:26:38.292 "num_base_bdevs_discovered": 1, 00:26:38.292 "num_base_bdevs_operational": 1, 00:26:38.292 "base_bdevs_list": [ 00:26:38.292 { 00:26:38.292 "name": null, 00:26:38.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.292 "is_configured": false, 00:26:38.292 "data_offset": 256, 00:26:38.292 "data_size": 7936 00:26:38.292 }, 00:26:38.292 { 00:26:38.292 "name": "BaseBdev2", 00:26:38.292 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:38.292 "is_configured": true, 00:26:38.292 "data_offset": 256, 00:26:38.292 "data_size": 7936 00:26:38.292 } 00:26:38.292 ] 00:26:38.292 }' 00:26:38.292 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.292 08:02:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.871 "name": "raid_bdev1", 00:26:38.871 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:38.871 "strip_size_kb": 0, 00:26:38.871 "state": "online", 00:26:38.871 "raid_level": "raid1", 00:26:38.871 "superblock": true, 00:26:38.871 "num_base_bdevs": 2, 00:26:38.871 "num_base_bdevs_discovered": 1, 00:26:38.871 "num_base_bdevs_operational": 1, 00:26:38.871 "base_bdevs_list": [ 00:26:38.871 { 00:26:38.871 "name": null, 00:26:38.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.871 "is_configured": false, 00:26:38.871 "data_offset": 256, 00:26:38.871 "data_size": 7936 00:26:38.871 }, 00:26:38.871 { 00:26:38.871 "name": "BaseBdev2", 00:26:38.871 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:38.871 "is_configured": true, 00:26:38.871 "data_offset": 256, 00:26:38.871 "data_size": 7936 00:26:38.871 } 00:26:38.871 ] 00:26:38.871 }' 00:26:38.871 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:39.129 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:39.129 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:39.129 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:39.129 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:39.389 08:02:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:39.389 [2024-07-15 08:02:24.061992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:39.389 [2024-07-15 08:02:24.062021] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:39.389 [2024-07-15 08:02:24.062033] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277e200 00:26:39.389 [2024-07-15 08:02:24.062038] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:39.389 [2024-07-15 08:02:24.062160] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:39.389 [2024-07-15 08:02:24.062169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:39.389 [2024-07-15 08:02:24.062199] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:39.389 [2024-07-15 08:02:24.062205] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:39.389 [2024-07-15 08:02:24.062211] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:39.389 BaseBdev1 00:26:39.389 08:02:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:40.327 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.587 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.587 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.587 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:40.587 "name": "raid_bdev1", 00:26:40.587 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:40.587 "strip_size_kb": 0, 00:26:40.587 "state": "online", 00:26:40.587 "raid_level": "raid1", 00:26:40.587 
"superblock": true, 00:26:40.587 "num_base_bdevs": 2, 00:26:40.587 "num_base_bdevs_discovered": 1, 00:26:40.587 "num_base_bdevs_operational": 1, 00:26:40.587 "base_bdevs_list": [ 00:26:40.587 { 00:26:40.587 "name": null, 00:26:40.587 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.587 "is_configured": false, 00:26:40.587 "data_offset": 256, 00:26:40.587 "data_size": 7936 00:26:40.587 }, 00:26:40.587 { 00:26:40.587 "name": "BaseBdev2", 00:26:40.587 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:40.587 "is_configured": true, 00:26:40.587 "data_offset": 256, 00:26:40.587 "data_size": 7936 00:26:40.587 } 00:26:40.587 ] 00:26:40.587 }' 00:26:40.587 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:40.587 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.158 08:02:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:41.417 "name": "raid_bdev1", 00:26:41.417 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:41.417 "strip_size_kb": 0, 00:26:41.417 "state": "online", 00:26:41.417 "raid_level": "raid1", 00:26:41.417 "superblock": true, 00:26:41.417 "num_base_bdevs": 2, 00:26:41.417 "num_base_bdevs_discovered": 1, 00:26:41.417 "num_base_bdevs_operational": 1, 00:26:41.417 "base_bdevs_list": [ 00:26:41.417 { 00:26:41.417 "name": null, 00:26:41.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.417 "is_configured": false, 00:26:41.417 "data_offset": 256, 00:26:41.417 "data_size": 7936 00:26:41.417 }, 00:26:41.417 { 00:26:41.417 "name": "BaseBdev2", 00:26:41.417 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:41.417 "is_configured": true, 00:26:41.417 "data_offset": 256, 00:26:41.417 "data_size": 7936 00:26:41.417 } 00:26:41.417 ] 00:26:41.417 }' 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:41.417 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:41.418 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:41.678 [2024-07-15 08:02:26.323872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:41.678 [2024-07-15 08:02:26.323962] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:41.678 [2024-07-15 08:02:26.323970] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:41.678 request: 00:26:41.678 { 00:26:41.678 "base_bdev": "BaseBdev1", 00:26:41.678 "raid_bdev": "raid_bdev1", 00:26:41.678 "method": "bdev_raid_add_base_bdev", 00:26:41.678 "req_id": 1 00:26:41.678 } 00:26:41.678 Got JSON-RPC error response 00:26:41.678 response: 00:26:41.678 { 00:26:41.678 "code": -22, 00:26:41.678 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:41.678 } 00:26:41.678 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:26:41.678 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:41.678 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:41.678 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:41.678 08:02:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:42.621 08:02:27 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.621 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.881 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.881 "name": "raid_bdev1", 00:26:42.881 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:42.881 "strip_size_kb": 0, 00:26:42.881 "state": "online", 00:26:42.881 "raid_level": "raid1", 00:26:42.881 "superblock": true, 00:26:42.881 "num_base_bdevs": 2, 00:26:42.881 "num_base_bdevs_discovered": 1, 00:26:42.881 "num_base_bdevs_operational": 1, 00:26:42.881 "base_bdevs_list": [ 00:26:42.881 { 00:26:42.881 "name": null, 00:26:42.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.881 "is_configured": false, 00:26:42.881 "data_offset": 256, 00:26:42.881 "data_size": 7936 00:26:42.881 }, 00:26:42.881 { 00:26:42.881 "name": "BaseBdev2", 00:26:42.881 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:42.881 "is_configured": true, 00:26:42.881 "data_offset": 256, 00:26:42.881 "data_size": 7936 00:26:42.881 } 00:26:42.881 ] 00:26:42.881 }' 00:26:42.881 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.881 08:02:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.449 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:43.708 "name": "raid_bdev1", 00:26:43.708 "uuid": "eee4bb98-3f2b-4d8f-99cb-fba99d64522b", 00:26:43.708 "strip_size_kb": 0, 00:26:43.708 "state": "online", 00:26:43.708 "raid_level": "raid1", 00:26:43.708 "superblock": true, 00:26:43.708 "num_base_bdevs": 2, 00:26:43.708 "num_base_bdevs_discovered": 1, 00:26:43.708 "num_base_bdevs_operational": 1, 00:26:43.708 "base_bdevs_list": [ 00:26:43.708 { 00:26:43.708 "name": null, 00:26:43.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:43.708 "is_configured": false, 00:26:43.708 "data_offset": 256, 00:26:43.708 "data_size": 7936 00:26:43.708 }, 00:26:43.708 { 00:26:43.708 "name": "BaseBdev2", 00:26:43.708 "uuid": "da839ab8-996d-5b9a-9735-8879ae55765c", 00:26:43.708 "is_configured": true, 00:26:43.708 "data_offset": 256, 00:26:43.708 "data_size": 7936 00:26:43.708 } 00:26:43.708 ] 00:26:43.708 }' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1765328 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1765328 ']' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1765328 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1765328 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1765328' 00:26:43.708 killing process with pid 1765328 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1765328 00:26:43.708 Received shutdown signal, test time was about 60.000000 seconds 00:26:43.708 00:26:43.708 Latency(us) 00:26:43.708 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:43.708 =================================================================================================================== 00:26:43.708 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:43.708 [2024-07-15 08:02:28.379046] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:43.708 [2024-07-15 08:02:28.379107] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:43.708 [2024-07-15 08:02:28.379138] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:43.708 [2024-07-15 08:02:28.379144] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x29139f0 name raid_bdev1, state offline 00:26:43.708 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1765328 00:26:43.708 [2024-07-15 08:02:28.394646] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:43.968 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:26:43.968 00:26:43.968 real 0m25.541s 00:26:43.968 user 0m40.639s 00:26:43.968 sys 0m2.656s 00:26:43.968 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:43.968 08:02:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:26:43.968 ************************************ 00:26:43.968 END TEST raid_rebuild_test_sb_md_interleaved 00:26:43.968 ************************************ 00:26:43.968 08:02:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:43.968 08:02:28 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:26:43.968 08:02:28 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:26:43.968 08:02:28 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1765328 ']' 00:26:43.968 08:02:28 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1765328 00:26:43.968 08:02:28 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:26:43.968 00:26:43.968 real 16m24.341s 00:26:43.968 user 28m11.345s 00:26:43.968 sys 2m24.351s 00:26:43.968 08:02:28 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:43.968 08:02:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:43.968 ************************************ 00:26:43.968 END TEST bdev_raid 00:26:43.968 ************************************ 00:26:43.968 08:02:28 -- common/autotest_common.sh@1142 -- # return 0 00:26:43.968 08:02:28 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:26:43.968 08:02:28 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:26:43.968 08:02:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:43.968 08:02:28 -- common/autotest_common.sh@10 -- # set +x 00:26:43.968 ************************************ 00:26:43.968 START TEST bdevperf_config 00:26:43.968 ************************************ 00:26:43.968 08:02:28 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:26:44.228 * Looking for test storage... 
00:26:44.228 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:44.228 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:44.228 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:44.228 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:26:44.228 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:44.228 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:44.228 08:02:28 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 08:02:28.909328] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:46.765 [2024-07-15 08:02:28.909383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1769931 ] 00:26:46.765 Using job config with 4 jobs 00:26:46.765 [2024-07-15 08:02:28.992511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:46.765 [2024-07-15 08:02:29.071927] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:46.765 cpumask for '\''job0'\'' is too big 00:26:46.765 cpumask for '\''job1'\'' is too big 00:26:46.765 cpumask for '\''job2'\'' is too big 00:26:46.765 cpumask for '\''job3'\'' is too big 00:26:46.765 Running I/O for 2 seconds... 00:26:46.765 00:26:46.765 Latency(us) 00:26:46.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.01 28007.99 27.35 0.00 0.00 9139.28 1600.59 14014.62 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 28017.71 27.36 0.00 0.00 9118.18 1600.59 12401.43 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 27996.08 27.34 0.00 0.00 9108.02 1587.99 10838.65 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 27974.46 27.32 0.00 0.00 9097.96 1581.69 9628.75 00:26:46.765 =================================================================================================================== 00:26:46.765 Total : 111996.25 109.37 0.00 0.00 9115.83 1581.69 14014.62' 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 08:02:28.909328] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:26:46.765 [2024-07-15 08:02:28.909383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1769931 ] 00:26:46.765 Using job config with 4 jobs 00:26:46.765 [2024-07-15 08:02:28.992511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:46.765 [2024-07-15 08:02:29.071927] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:46.765 cpumask for '\''job0'\'' is too big 00:26:46.765 cpumask for '\''job1'\'' is too big 00:26:46.765 cpumask for '\''job2'\'' is too big 00:26:46.765 cpumask for '\''job3'\'' is too big 00:26:46.765 Running I/O for 2 seconds... 00:26:46.765 00:26:46.765 Latency(us) 00:26:46.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.01 28007.99 27.35 0.00 0.00 9139.28 1600.59 14014.62 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 28017.71 27.36 0.00 0.00 9118.18 1600.59 12401.43 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 27996.08 27.34 0.00 0.00 9108.02 1587.99 10838.65 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 27974.46 27.32 0.00 0.00 9097.96 1581.69 9628.75 00:26:46.765 =================================================================================================================== 00:26:46.765 Total : 111996.25 109.37 0.00 0.00 9115.83 1581.69 14014.62' 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 08:02:28.909328] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:46.765 [2024-07-15 08:02:28.909383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1769931 ] 00:26:46.765 Using job config with 4 jobs 00:26:46.765 [2024-07-15 08:02:28.992511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:46.765 [2024-07-15 08:02:29.071927] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:46.765 cpumask for '\''job0'\'' is too big 00:26:46.765 cpumask for '\''job1'\'' is too big 00:26:46.765 cpumask for '\''job2'\'' is too big 00:26:46.765 cpumask for '\''job3'\'' is too big 00:26:46.765 Running I/O for 2 seconds... 
00:26:46.765 00:26:46.765 Latency(us) 00:26:46.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.01 28007.99 27.35 0.00 0.00 9139.28 1600.59 14014.62 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 28017.71 27.36 0.00 0.00 9118.18 1600.59 12401.43 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 27996.08 27.34 0.00 0.00 9108.02 1587.99 10838.65 00:26:46.765 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:46.765 Malloc0 : 2.02 27974.46 27.32 0.00 0.00 9097.96 1581.69 9628.75 00:26:46.765 =================================================================================================================== 00:26:46.765 Total : 111996.25 109.37 0.00 0.00 9115.83 1581.69 14014.62' 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:26:46.765 08:02:31 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:46.765 [2024-07-15 08:02:31.409011] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:46.765 [2024-07-15 08:02:31.409061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1770389 ] 00:26:46.765 [2024-07-15 08:02:31.518168] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:47.025 [2024-07-15 08:02:31.591514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:47.025 cpumask for 'job0' is too big 00:26:47.025 cpumask for 'job1' is too big 00:26:47.025 cpumask for 'job2' is too big 00:26:47.025 cpumask for 'job3' is too big 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:26:49.565 Running I/O for 2 seconds... 
00:26:49.565 00:26:49.565 Latency(us) 00:26:49.565 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:49.565 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:49.565 Malloc0 : 2.01 28212.06 27.55 0.00 0.00 9065.90 1613.19 14014.62 00:26:49.565 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:49.565 Malloc0 : 2.02 28190.07 27.53 0.00 0.00 9054.47 1625.80 12300.60 00:26:49.565 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:49.565 Malloc0 : 2.02 28168.18 27.51 0.00 0.00 9045.22 1594.29 10737.82 00:26:49.565 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:26:49.565 Malloc0 : 2.02 28240.93 27.58 0.00 0.00 9005.28 784.54 9326.28 00:26:49.565 =================================================================================================================== 00:26:49.565 Total : 112811.23 110.17 0.00 0.00 9042.67 784.54 14014.62' 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:49.565 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:26:49.565 08:02:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:49.566 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:49.566 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:49.566 08:02:33 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 08:02:33.927309] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:52.104 [2024-07-15 08:02:33.927364] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1770708 ] 00:26:52.104 Using job config with 3 jobs 00:26:52.104 [2024-07-15 08:02:34.028034] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:52.104 [2024-07-15 08:02:34.099570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.104 cpumask for '\''job0'\'' is too big 00:26:52.104 cpumask for '\''job1'\'' is too big 00:26:52.104 cpumask for '\''job2'\'' is too big 00:26:52.104 Running I/O for 2 seconds... 00:26:52.104 00:26:52.104 Latency(us) 00:26:52.104 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.01 38388.11 37.49 0.00 0.00 6671.60 1575.38 9830.40 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.02 38358.07 37.46 0.00 0.00 6664.43 1569.08 8267.62 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.02 38328.24 37.43 0.00 0.00 6656.72 1550.18 6906.49 00:26:52.104 =================================================================================================================== 00:26:52.104 Total : 115074.42 112.38 0.00 0.00 6664.25 1550.18 9830.40' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 08:02:33.927309] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:52.104 [2024-07-15 08:02:33.927364] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1770708 ] 00:26:52.104 Using job config with 3 jobs 00:26:52.104 [2024-07-15 08:02:34.028034] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:52.104 [2024-07-15 08:02:34.099570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.104 cpumask for '\''job0'\'' is too big 00:26:52.104 cpumask for '\''job1'\'' is too big 00:26:52.104 cpumask for '\''job2'\'' is too big 00:26:52.104 Running I/O for 2 seconds... 
00:26:52.104 00:26:52.104 Latency(us) 00:26:52.104 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.01 38388.11 37.49 0.00 0.00 6671.60 1575.38 9830.40 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.02 38358.07 37.46 0.00 0.00 6664.43 1569.08 8267.62 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.02 38328.24 37.43 0.00 0.00 6656.72 1550.18 6906.49 00:26:52.104 =================================================================================================================== 00:26:52.104 Total : 115074.42 112.38 0.00 0.00 6664.25 1550.18 9830.40' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 08:02:33.927309] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:52.104 [2024-07-15 08:02:33.927364] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1770708 ] 00:26:52.104 Using job config with 3 jobs 00:26:52.104 [2024-07-15 08:02:34.028034] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:52.104 [2024-07-15 08:02:34.099570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.104 cpumask for '\''job0'\'' is too big 00:26:52.104 cpumask for '\''job1'\'' is too big 00:26:52.104 cpumask for '\''job2'\'' is too big 00:26:52.104 Running I/O for 2 seconds... 00:26:52.104 00:26:52.104 Latency(us) 00:26:52.104 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.01 38388.11 37.49 0.00 0.00 6671.60 1575.38 9830.40 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.02 38358.07 37.46 0.00 0.00 6664.43 1569.08 8267.62 00:26:52.104 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:26:52.104 Malloc0 : 2.02 38328.24 37.43 0.00 0.00 6656.72 1550.18 6906.49 00:26:52.104 =================================================================================================================== 00:26:52.104 Total : 115074.42 112.38 0.00 0.00 6664.25 1550.18 9830.40' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:52.104 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:52.104 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:52.104 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:52.104 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:26:52.104 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:26:52.104 08:02:36 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:54.707 08:02:38 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 08:02:36.464783] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:26:54.707 [2024-07-15 08:02:36.464842] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1771233 ] 00:26:54.707 Using job config with 4 jobs 00:26:54.707 [2024-07-15 08:02:36.567364] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:54.707 [2024-07-15 08:02:36.644867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:54.707 cpumask for '\''job0'\'' is too big 00:26:54.707 cpumask for '\''job1'\'' is too big 00:26:54.707 cpumask for '\''job2'\'' is too big 00:26:54.707 cpumask for '\''job3'\'' is too big 00:26:54.707 Running I/O for 2 seconds... 00:26:54.707 00:26:54.707 Latency(us) 00:26:54.707 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 14010.98 13.68 0.00 0.00 18250.45 3251.59 28230.89 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.03 13999.78 13.67 0.00 0.00 18250.63 3906.95 28230.89 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 13988.87 13.66 0.00 0.00 18211.35 3188.58 24903.68 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.03 13977.77 13.65 0.00 0.00 18209.33 3906.95 24903.68 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 13966.86 13.64 0.00 0.00 18167.70 3188.58 21677.29 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.04 13955.72 13.63 0.00 0.00 18166.89 3932.16 21677.29 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.04 14038.38 13.71 0.00 0.00 18005.81 3112.96 18551.73 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.04 14027.29 13.70 0.00 0.00 18004.83 2419.79 18551.73 00:26:54.707 =================================================================================================================== 00:26:54.707 Total : 111965.65 109.34 0.00 0.00 18158.03 2419.79 28230.89' 00:26:54.707 08:02:38 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 08:02:36.464783] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:54.707 [2024-07-15 08:02:36.464842] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1771233 ] 00:26:54.707 Using job config with 4 jobs 00:26:54.707 [2024-07-15 08:02:36.567364] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:54.707 [2024-07-15 08:02:36.644867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:54.707 cpumask for '\''job0'\'' is too big 00:26:54.707 cpumask for '\''job1'\'' is too big 00:26:54.707 cpumask for '\''job2'\'' is too big 00:26:54.707 cpumask for '\''job3'\'' is too big 00:26:54.707 Running I/O for 2 seconds... 
00:26:54.707 00:26:54.707 Latency(us) 00:26:54.707 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 14010.98 13.68 0.00 0.00 18250.45 3251.59 28230.89 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.03 13999.78 13.67 0.00 0.00 18250.63 3906.95 28230.89 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 13988.87 13.66 0.00 0.00 18211.35 3188.58 24903.68 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.03 13977.77 13.65 0.00 0.00 18209.33 3906.95 24903.68 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 13966.86 13.64 0.00 0.00 18167.70 3188.58 21677.29 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.04 13955.72 13.63 0.00 0.00 18166.89 3932.16 21677.29 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.04 14038.38 13.71 0.00 0.00 18005.81 3112.96 18551.73 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.04 14027.29 13.70 0.00 0.00 18004.83 2419.79 18551.73 00:26:54.707 =================================================================================================================== 00:26:54.707 Total : 111965.65 109.34 0.00 0.00 18158.03 2419.79 28230.89' 00:26:54.707 08:02:38 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 08:02:36.464783] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:54.707 [2024-07-15 08:02:36.464842] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1771233 ] 00:26:54.707 Using job config with 4 jobs 00:26:54.707 [2024-07-15 08:02:36.567364] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:54.707 [2024-07-15 08:02:36.644867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:54.707 cpumask for '\''job0'\'' is too big 00:26:54.707 cpumask for '\''job1'\'' is too big 00:26:54.707 cpumask for '\''job2'\'' is too big 00:26:54.707 cpumask for '\''job3'\'' is too big 00:26:54.707 Running I/O for 2 seconds... 
00:26:54.707 00:26:54.707 Latency(us) 00:26:54.707 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 14010.98 13.68 0.00 0.00 18250.45 3251.59 28230.89 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.03 13999.78 13.67 0.00 0.00 18250.63 3906.95 28230.89 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 13988.87 13.66 0.00 0.00 18211.35 3188.58 24903.68 00:26:54.707 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc1 : 2.03 13977.77 13.65 0.00 0.00 18209.33 3906.95 24903.68 00:26:54.707 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.707 Malloc0 : 2.03 13966.86 13.64 0.00 0.00 18167.70 3188.58 21677.29 00:26:54.708 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.708 Malloc1 : 2.04 13955.72 13.63 0.00 0.00 18166.89 3932.16 21677.29 00:26:54.708 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.708 Malloc0 : 2.04 14038.38 13.71 0.00 0.00 18005.81 3112.96 18551.73 00:26:54.708 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:26:54.708 Malloc1 : 2.04 14027.29 13.70 0.00 0.00 18004.83 2419.79 18551.73 00:26:54.708 =================================================================================================================== 00:26:54.708 Total : 111965.65 109.34 0.00 0.00 18158.03 2419.79 28230.89' 00:26:54.708 08:02:38 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:26:54.708 08:02:38 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:26:54.708 08:02:38 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:26:54.708 08:02:38 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:26:54.708 08:02:38 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:26:54.708 08:02:38 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:26:54.708 00:26:54.708 real 0m10.245s 00:26:54.708 user 0m9.300s 00:26:54.708 sys 0m0.792s 00:26:54.708 08:02:38 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:54.708 08:02:38 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:26:54.708 ************************************ 00:26:54.708 END TEST bdevperf_config 00:26:54.708 ************************************ 00:26:54.708 08:02:38 -- common/autotest_common.sh@1142 -- # return 0 00:26:54.708 08:02:38 -- spdk/autotest.sh@192 -- # uname -s 00:26:54.708 08:02:38 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:26:54.708 08:02:38 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:54.708 08:02:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:26:54.708 08:02:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:54.708 08:02:38 -- common/autotest_common.sh@10 -- # set +x 00:26:54.708 ************************************ 00:26:54.708 START TEST reactor_set_interrupt 00:26:54.708 ************************************ 00:26:54.708 08:02:39 
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:54.708 * Looking for test storage... 00:26:54.708 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:54.708 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:26:54.708 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
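The trace above starts the test by locating itself: interrupt_common.sh resolves its own directory with dirname plus readlink -f, derives the repository root from it, and then sources build_config.sh so the CONFIG_* switches listed here (and continuing below) are visible to the test. A minimal standalone sketch of that bootstrap pattern; the file names and the ../.. depth are illustrative assumptions, not lifted from the test itself:

  #!/usr/bin/env bash
  # Resolve the directory holding this script, following any symlinks.
  testdir=$(readlink -f "$(dirname "$0")")
  # Walk up to the repository root; adjust the number of ".." for your layout.
  rootdir=$(readlink -f "$testdir/../..")
  # Pull in the generated build configuration so CONFIG_* variables are defined.
  source "$rootdir/test/common/build_config.sh"
  echo "testdir=$testdir rootdir=$rootdir CONFIG_CRYPTO=$CONFIG_CRYPTO"

The same testdir/rootdir pair is what the later SPDK_LIB_DIR, PYTHONPATH and LD_LIBRARY_PATH exports in this trace are built from.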
00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:26:54.708 08:02:39 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:26:54.708 08:02:39 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:26:54.709 08:02:39 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:26:54.709 08:02:39 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:26:54.709 08:02:39 
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:26:54.709 #define SPDK_CONFIG_H 00:26:54.709 #define SPDK_CONFIG_APPS 1 00:26:54.709 #define SPDK_CONFIG_ARCH native 00:26:54.709 #undef SPDK_CONFIG_ASAN 00:26:54.709 #undef SPDK_CONFIG_AVAHI 00:26:54.709 #undef SPDK_CONFIG_CET 00:26:54.709 #define SPDK_CONFIG_COVERAGE 1 00:26:54.709 #define SPDK_CONFIG_CROSS_PREFIX 00:26:54.709 #define SPDK_CONFIG_CRYPTO 1 00:26:54.709 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:26:54.709 #undef SPDK_CONFIG_CUSTOMOCF 00:26:54.709 #undef SPDK_CONFIG_DAOS 00:26:54.709 #define SPDK_CONFIG_DAOS_DIR 00:26:54.709 #define SPDK_CONFIG_DEBUG 1 00:26:54.709 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:26:54.709 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:26:54.709 #define SPDK_CONFIG_DPDK_INC_DIR 00:26:54.709 #define SPDK_CONFIG_DPDK_LIB_DIR 00:26:54.709 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:26:54.709 #undef SPDK_CONFIG_DPDK_UADK 00:26:54.709 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:26:54.709 #define SPDK_CONFIG_EXAMPLES 1 00:26:54.709 #undef SPDK_CONFIG_FC 00:26:54.709 #define SPDK_CONFIG_FC_PATH 00:26:54.709 #define SPDK_CONFIG_FIO_PLUGIN 1 00:26:54.709 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:26:54.709 #undef SPDK_CONFIG_FUSE 00:26:54.709 #undef SPDK_CONFIG_FUZZER 00:26:54.709 #define SPDK_CONFIG_FUZZER_LIB 00:26:54.709 #undef SPDK_CONFIG_GOLANG 00:26:54.709 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:26:54.709 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:26:54.709 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:26:54.709 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:26:54.709 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:26:54.709 #undef SPDK_CONFIG_HAVE_LIBBSD 00:26:54.709 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:26:54.709 #define SPDK_CONFIG_IDXD 1 00:26:54.709 #define SPDK_CONFIG_IDXD_KERNEL 1 00:26:54.709 #define SPDK_CONFIG_IPSEC_MB 1 00:26:54.709 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:26:54.709 #define SPDK_CONFIG_ISAL 1 00:26:54.709 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:26:54.709 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:26:54.709 #define SPDK_CONFIG_LIBDIR 00:26:54.709 #undef SPDK_CONFIG_LTO 00:26:54.709 #define SPDK_CONFIG_MAX_LCORES 128 00:26:54.709 #define SPDK_CONFIG_NVME_CUSE 1 00:26:54.709 #undef SPDK_CONFIG_OCF 00:26:54.709 #define SPDK_CONFIG_OCF_PATH 00:26:54.709 #define SPDK_CONFIG_OPENSSL_PATH 00:26:54.709 #undef SPDK_CONFIG_PGO_CAPTURE 00:26:54.709 #define SPDK_CONFIG_PGO_DIR 00:26:54.709 #undef SPDK_CONFIG_PGO_USE 00:26:54.709 #define SPDK_CONFIG_PREFIX /usr/local 00:26:54.709 #undef SPDK_CONFIG_RAID5F 00:26:54.709 #undef SPDK_CONFIG_RBD 00:26:54.709 #define SPDK_CONFIG_RDMA 1 00:26:54.709 #define SPDK_CONFIG_RDMA_PROV verbs 00:26:54.709 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:26:54.709 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:26:54.709 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:26:54.709 #define SPDK_CONFIG_SHARED 1 00:26:54.709 #undef SPDK_CONFIG_SMA 00:26:54.709 #define SPDK_CONFIG_TESTS 1 00:26:54.709 #undef SPDK_CONFIG_TSAN 00:26:54.709 #define SPDK_CONFIG_UBLK 1 00:26:54.709 #define SPDK_CONFIG_UBSAN 1 00:26:54.709 #undef SPDK_CONFIG_UNIT_TESTS 00:26:54.709 #undef SPDK_CONFIG_URING 00:26:54.709 #define SPDK_CONFIG_URING_PATH 00:26:54.709 #undef SPDK_CONFIG_URING_ZNS 00:26:54.709 #undef SPDK_CONFIG_USDT 00:26:54.709 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:26:54.709 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:26:54.709 #undef SPDK_CONFIG_VFIO_USER 00:26:54.709 #define SPDK_CONFIG_VFIO_USER_DIR 00:26:54.709 #define SPDK_CONFIG_VHOST 1 00:26:54.709 #define SPDK_CONFIG_VIRTIO 1 00:26:54.709 #undef SPDK_CONFIG_VTUNE 00:26:54.709 #define SPDK_CONFIG_VTUNE_DIR 00:26:54.709 #define SPDK_CONFIG_WERROR 1 00:26:54.709 #define SPDK_CONFIG_WPDK_DIR 00:26:54.709 #undef SPDK_CONFIG_XNVME 00:26:54.709 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:54.709 08:02:39 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:54.709 08:02:39 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.709 08:02:39 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.709 08:02:39 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.709 08:02:39 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:26:54.709 08:02:39 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:26:54.709 
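Just above, applications.sh reads the generated include/spdk/config.h and decides whether this is a debug build by glob-matching the whole file contents against '#define SPDK_CONFIG_DEBUG'; that is what the long escaped pattern in the trace encodes. A small hedged sketch of the same check, with the header path written out as an assumption taken from this workspace:

  # Load the generated config header once, then test for a #define with a bash glob match.
  config_h=/var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h
  if [[ $(< "$config_h") == *"#define SPDK_CONFIG_DEBUG"* ]]; then
    echo "debug build"
  else
    echo "release build"
  fi

Reading the file with $(< file) avoids spawning cat and keeps the whole check a single [[ ]] test.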
08:02:39 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:26:54.709 08:02:39 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:26:54.709 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:26:54.710 
08:02:39 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:26:54.710 08:02:39 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:26:54.710 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:26:54.711 
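The long run of autotest_common.sh lines above, each ': 0' or ': 1' immediately followed by an export of an SPDK_TEST_* or SPDK_RUN_* variable, is the usual shell idiom for giving every test switch a default only when the caller has not already set it, then exporting it for the child test scripts. A sketch of that idiom with a deliberately made-up flag name:

  # Assign a default of 0 only if SPDK_TEST_EXAMPLE is not already set in the environment,
  # then export it so spawned test scripts inherit the value.
  : "${SPDK_TEST_EXAMPLE=0}"
  export SPDK_TEST_EXAMPLE
  # A caller can still override it per run, for example:
  #   SPDK_TEST_EXAMPLE=1 ./my_test.sh

The lines in the trace that show ': 1' (for instance right before export SPDK_TEST_CRYPTO or export SPDK_RUN_UBSAN) are flags this run had already set to 1, so the 0 default never applies to them.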
08:02:39 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1771666 ]] 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1771666 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.MWI0wZ 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.MWI0wZ/tests/interrupt /tmp/spdk.MWI0wZ 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=954712064 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4329717760 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=123730505728 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376292864 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5645787136 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64683433984 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=25865379840 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9879552 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:26:54.711 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64687554560 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688148480 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=593920 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:26:54.712 
08:02:39 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:26:54.712 * Looking for test storage... 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=123730505728 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7860379648 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.712 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # 
xtrace_fd 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1771713 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:54.712 08:02:39 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1771713 /var/tmp/spdk.sock 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@829 
-- # '[' -z 1771713 ']' 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:54.712 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:54.712 08:02:39 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:54.712 [2024-07-15 08:02:39.355787] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:54.712 [2024-07-15 08:02:39.355838] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1771713 ] 00:26:54.712 [2024-07-15 08:02:39.444141] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:54.972 [2024-07-15 08:02:39.509052] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:54.972 [2024-07-15 08:02:39.509202] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:54.972 [2024-07-15 08:02:39.509203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:54.972 [2024-07-15 08:02:39.558760] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:55.542 08:02:40 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:55.542 08:02:40 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:26:55.542 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:26:55.542 08:02:40 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:55.801 Malloc0 00:26:55.801 Malloc1 00:26:55.801 Malloc2 00:26:55.801 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:26:55.801 08:02:40 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:26:55.801 08:02:40 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:26:55.801 08:02:40 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:26:55.801 5000+0 records in 00:26:55.801 5000+0 records out 00:26:55.801 10240000 bytes (10 MB, 9.8 MiB) copied, 0.016456 s, 622 MB/s 00:26:55.801 08:02:40 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:26:56.061 AIO0 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1771713 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1771713 without_thd 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1771713 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local 
without_thd=without_thd 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:56.061 08:02:40 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:26:56.321 08:02:40 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:26:56.581 spdk_thread ids are 1 on reactor0. 
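Condensed for readability: the reactor_get_thread_ids calls traced above ask the running target for thread stats over its RPC socket and filter them by cpumask with jq. The sketch below only restates what the xtrace already shows (the rpc.py path and jq expression are copied from the log; the wrapper function itself is illustrative):

    #!/usr/bin/env bash
    # Sketch of the per-reactor thread-id lookup seen in interrupt/common.sh above.
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    reactor_get_thread_ids() {
        # Masks arrive as 0x1 / 0x4 and are compared without the 0x prefix (0x1 -> 1, 0x4 -> 4).
        local reactor_cpumask=$(($1))
        "$rpc_py" thread_get_stats |
            jq --arg reactor_cpumask "$reactor_cpumask" \
               '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
    }

    # As used in the trace: collect spdk_thread ids pinned to reactor 0 and reactor 2.
    thd0_ids=($(reactor_get_thread_ids 0x1))
    thd2_ids=($(reactor_get_thread_ids 0x4))

With only the app_thread registered, reactor 0 reports thread id 1 and reactor 2 reports none, which matches the "spdk_thread ids are 1 on reactor0." line above.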
00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1771713 0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1771713 0 idle 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771713 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.29 reactor_0' 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771713 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.29 reactor_0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1771713 1 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1771713 1 idle 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:56.581 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771716 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_1' 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771716 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_1 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1771713 2 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1771713 2 idle 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:56.841 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:57.101 08:02:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771717 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_2' 00:26:57.101 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771717 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_2 00:26:57.101 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:57.101 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:57.101 08:02:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:57.101 08:02:41 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:26:57.102 [2024-07-15 08:02:41.806049] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:26:57.102 08:02:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:26:57.362 [2024-07-15 08:02:42.033590] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:26:57.362 [2024-07-15 08:02:42.034213] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:57.362 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:26:57.621 [2024-07-15 08:02:42.281540] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:26:57.621 [2024-07-15 08:02:42.281821] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1771713 0 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1771713 0 busy 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:57.621 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771713 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.74 reactor_0' 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771713 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.74 reactor_0 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:57.881 08:02:42 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1771713 2 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1771713 2 busy 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:57.881 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771717 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.36 reactor_2' 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771717 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.36 reactor_2 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:26:58.141 [2024-07-15 08:02:42.841540] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:26:58.141 [2024-07-15 08:02:42.841630] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1771713 2 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1771713 2 idle 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:58.141 08:02:42 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771717 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.55 reactor_2' 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771717 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.55 reactor_2 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:58.401 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:26:58.660 [2024-07-15 08:02:43.229541] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:26:58.660 [2024-07-15 08:02:43.229923] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:26:58.660 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:26:58.660 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:26:58.660 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:26:58.920 [2024-07-15 08:02:43.474057] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
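Each reactor_is_busy / reactor_is_idle check in this trace (including the one that follows) takes a single batch sample from top for the target pid, greps the reactor_<idx> thread, and reads the %CPU column. A minimal sketch of that probe, with commands and thresholds as shown in the xtrace (the helper name here is only illustrative):

    # One-shot CPU sample for a reactor thread, as used by the busy/idle checks above.
    reactor_cpu_rate() {
        local pid=$1 idx=$2
        top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}" |
            sed -e 's/^\s*//g' | awk '{print $9}'
    }

    # e.g. reactor_cpu_rate 1771713 0
    # The trace then accepts "busy" only at >= 70% CPU and "idle" only at <= 30% CPU.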
00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1771713 0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1771713 0 idle 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1771713 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1771713 -w 256 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1771713 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:01.49 reactor_0' 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1771713 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:01.49 reactor_0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:26:58.920 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1771713 00:26:58.920 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1771713 ']' 00:26:58.920 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1771713 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1771713 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1771713' 00:26:59.181 killing process with pid 1771713 00:26:59.181 08:02:43 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 1771713 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1771713 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1772606 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1772606 /var/tmp/spdk.sock 00:26:59.181 08:02:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1772606 ']' 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:59.181 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:59.181 08:02:43 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:26:59.442 [2024-07-15 08:02:43.947443] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:26:59.442 [2024-07-15 08:02:43.947500] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1772606 ] 00:26:59.442 [2024-07-15 08:02:44.036570] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:59.442 [2024-07-15 08:02:44.107017] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:59.442 [2024-07-15 08:02:44.107162] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:59.442 [2024-07-15 08:02:44.107162] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:59.442 [2024-07-15 08:02:44.157291] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
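The second target started above repeats the same exercise with the spdk threads left in place. In both passes the mode switch itself is a single RPC per reactor, as the xtrace shows; condensed (rpc.py path as in the log, the loops are illustrative):

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # '-d' disables interrupt mode, so the reactor falls back to polling and its
    # thread is expected to sit near 100% CPU; omitting '-d' re-enables interrupt
    # mode and the thread is expected to go idle again.
    for i in 0 2; do
        "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$i" -d
    done
    for i in 2 0; do
        "$rpc_py" --plugin interrupt_plugin reactor_set_interrupt_mode "$i"
    done
    # In the earlier without_thd pass the app_thread was additionally parked on
    # reactor 1 first (thread_set_cpumask -i 1 -m 0x2) and moved back afterwards
    # (thread_set_cpumask -i 1 -m 0x1), as the trace above shows.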
00:27:00.382 08:02:44 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:00.382 08:02:44 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:27:00.382 08:02:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:27:00.382 08:02:44 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:00.382 Malloc0 00:27:00.382 Malloc1 00:27:00.382 Malloc2 00:27:00.382 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:27:00.382 08:02:45 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:27:00.382 08:02:45 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:27:00.382 08:02:45 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:27:00.382 5000+0 records in 00:27:00.382 5000+0 records out 00:27:00.382 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0175108 s, 585 MB/s 00:27:00.382 08:02:45 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:27:00.643 AIO0 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1772606 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1772606 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1772606 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:27:00.643 08:02:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:27:00.903 08:02:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:27:01.164 spdk_thread ids are 1 on reactor0. 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1772606 0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1772606 0 idle 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772606 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.29 reactor_0' 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772606 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.29 reactor_0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1772606 1 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1772606 1 idle 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:27:01.164 08:02:45 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:01.164 08:02:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772644 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_1' 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772644 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_1 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1772606 2 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1772606 2 idle 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:01.424 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772645 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_2' 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772645 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.00 reactor_2 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:27:01.685 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:27:01.685 [2024-07-15 08:02:46.435794] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:27:01.685 [2024-07-15 08:02:46.436026] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:27:01.685 [2024-07-15 08:02:46.436315] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:01.944 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:27:01.944 [2024-07-15 08:02:46.640073] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:27:01.944 [2024-07-15 08:02:46.640380] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:01.944 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1772606 0 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1772606 0 busy 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:01.945 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772606 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.68 reactor_0' 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772606 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.68 reactor_0 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:27:02.205 08:02:46 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1772606 2 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1772606 2 busy 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:02.205 08:02:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772645 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.35 reactor_2' 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772645 root 20 0 128.2g 36864 23552 R 99.9 0.0 0:00.35 reactor_2 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:02.464 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:27:02.464 [2024-07-15 08:02:47.205609] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:27:02.464 [2024-07-15 08:02:47.205818] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1772606 2 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1772606 2 idle 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772645 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.56 reactor_2' 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772645 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:00.56 reactor_2 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:02.724 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:27:02.984 [2024-07-15 08:02:47.594560] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:27:02.984 [2024-07-15 08:02:47.594875] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:27:02.984 [2024-07-15 08:02:47.594894] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1772606 0 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1772606 0 idle 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1772606 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1772606 -w 256 00:27:02.984 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1772606 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:01.45 reactor_0' 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1772606 root 20 0 128.2g 36864 23552 S 0.0 0.0 0:01.45 reactor_0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:27:03.250 08:02:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1772606 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1772606 ']' 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1772606 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1772606 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1772606' 00:27:03.250 killing process with pid 1772606 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1772606 00:27:03.250 08:02:47 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1772606 00:27:03.512 08:02:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:27:03.512 08:02:48 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:27:03.512 00:27:03.512 real 0m9.000s 00:27:03.512 user 0m8.420s 00:27:03.512 sys 0m1.646s 00:27:03.512 08:02:48 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:03.512 08:02:48 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:27:03.512 ************************************ 00:27:03.512 END TEST reactor_set_interrupt 00:27:03.512 ************************************ 00:27:03.512 08:02:48 -- common/autotest_common.sh@1142 -- # return 0 00:27:03.512 08:02:48 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:27:03.512 08:02:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:27:03.512 08:02:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:03.512 08:02:48 -- common/autotest_common.sh@10 -- # set +x 00:27:03.512 ************************************ 00:27:03.512 START TEST reap_unregistered_poller 00:27:03.512 ************************************ 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:27:03.512 * Looking for test storage... 00:27:03.512 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:03.512 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:27:03.512 08:02:48 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:27:03.512 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:27:03.512 08:02:48 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:27:03.512 #define SPDK_CONFIG_H 00:27:03.512 #define SPDK_CONFIG_APPS 1 00:27:03.512 #define SPDK_CONFIG_ARCH native 00:27:03.512 #undef SPDK_CONFIG_ASAN 00:27:03.512 #undef SPDK_CONFIG_AVAHI 00:27:03.512 #undef SPDK_CONFIG_CET 00:27:03.512 #define SPDK_CONFIG_COVERAGE 1 00:27:03.512 #define SPDK_CONFIG_CROSS_PREFIX 00:27:03.512 #define SPDK_CONFIG_CRYPTO 1 00:27:03.512 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:27:03.512 #undef SPDK_CONFIG_CUSTOMOCF 00:27:03.512 #undef SPDK_CONFIG_DAOS 00:27:03.512 #define SPDK_CONFIG_DAOS_DIR 00:27:03.512 #define SPDK_CONFIG_DEBUG 1 00:27:03.512 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:27:03.512 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:27:03.512 #define SPDK_CONFIG_DPDK_INC_DIR 00:27:03.512 #define SPDK_CONFIG_DPDK_LIB_DIR 00:27:03.512 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:27:03.512 #undef SPDK_CONFIG_DPDK_UADK 00:27:03.512 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:27:03.512 #define SPDK_CONFIG_EXAMPLES 1 00:27:03.512 #undef SPDK_CONFIG_FC 00:27:03.512 #define SPDK_CONFIG_FC_PATH 00:27:03.512 #define SPDK_CONFIG_FIO_PLUGIN 1 00:27:03.512 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:27:03.512 #undef SPDK_CONFIG_FUSE 00:27:03.512 #undef SPDK_CONFIG_FUZZER 00:27:03.512 #define SPDK_CONFIG_FUZZER_LIB 00:27:03.512 #undef SPDK_CONFIG_GOLANG 00:27:03.512 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:27:03.512 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:27:03.512 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:27:03.512 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:27:03.512 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:27:03.512 #undef SPDK_CONFIG_HAVE_LIBBSD 00:27:03.512 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:27:03.512 #define SPDK_CONFIG_IDXD 1 00:27:03.512 #define SPDK_CONFIG_IDXD_KERNEL 1 00:27:03.512 #define SPDK_CONFIG_IPSEC_MB 1 00:27:03.512 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:27:03.512 #define SPDK_CONFIG_ISAL 1 00:27:03.512 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:27:03.512 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:27:03.512 #define SPDK_CONFIG_LIBDIR 00:27:03.512 #undef SPDK_CONFIG_LTO 
00:27:03.512 #define SPDK_CONFIG_MAX_LCORES 128 00:27:03.512 #define SPDK_CONFIG_NVME_CUSE 1 00:27:03.512 #undef SPDK_CONFIG_OCF 00:27:03.512 #define SPDK_CONFIG_OCF_PATH 00:27:03.512 #define SPDK_CONFIG_OPENSSL_PATH 00:27:03.512 #undef SPDK_CONFIG_PGO_CAPTURE 00:27:03.512 #define SPDK_CONFIG_PGO_DIR 00:27:03.512 #undef SPDK_CONFIG_PGO_USE 00:27:03.512 #define SPDK_CONFIG_PREFIX /usr/local 00:27:03.512 #undef SPDK_CONFIG_RAID5F 00:27:03.512 #undef SPDK_CONFIG_RBD 00:27:03.513 #define SPDK_CONFIG_RDMA 1 00:27:03.513 #define SPDK_CONFIG_RDMA_PROV verbs 00:27:03.513 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:27:03.513 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:27:03.513 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:27:03.513 #define SPDK_CONFIG_SHARED 1 00:27:03.513 #undef SPDK_CONFIG_SMA 00:27:03.513 #define SPDK_CONFIG_TESTS 1 00:27:03.513 #undef SPDK_CONFIG_TSAN 00:27:03.513 #define SPDK_CONFIG_UBLK 1 00:27:03.513 #define SPDK_CONFIG_UBSAN 1 00:27:03.513 #undef SPDK_CONFIG_UNIT_TESTS 00:27:03.513 #undef SPDK_CONFIG_URING 00:27:03.513 #define SPDK_CONFIG_URING_PATH 00:27:03.513 #undef SPDK_CONFIG_URING_ZNS 00:27:03.513 #undef SPDK_CONFIG_USDT 00:27:03.513 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:27:03.513 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:27:03.513 #undef SPDK_CONFIG_VFIO_USER 00:27:03.513 #define SPDK_CONFIG_VFIO_USER_DIR 00:27:03.513 #define SPDK_CONFIG_VHOST 1 00:27:03.513 #define SPDK_CONFIG_VIRTIO 1 00:27:03.513 #undef SPDK_CONFIG_VTUNE 00:27:03.513 #define SPDK_CONFIG_VTUNE_DIR 00:27:03.513 #define SPDK_CONFIG_WERROR 1 00:27:03.513 #define SPDK_CONFIG_WPDK_DIR 00:27:03.513 #undef SPDK_CONFIG_XNVME 00:27:03.513 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:27:03.513 08:02:48 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:27:03.513 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:03.775 08:02:48 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:03.775 08:02:48 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:03.775 08:02:48 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:03.775 08:02:48 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:03.775 08:02:48 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:27:03.775 08:02:48 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:27:03.775 08:02:48 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:27:03.775 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:27:03.776 08:02:48 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:27:03.776 08:02:48 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
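The long run of ": <value>" / "export SPDK_TEST_*" pairs traced above is autotest_common.sh defaulting its feature flags: a flag keeps whatever value was already exported earlier in this run (hence ": 1" for SPDK_TEST_CRYPTO) and otherwise falls back to a built-in default (hence ": rdma" for SPDK_TEST_NVMF_TRANSPORT), then is exported so every child script inherits it. A hedged sketch of that idiom follows; it is not copied from autotest_common.sh, and only the flag names and traced values come from this log:

  # Keep the caller's value if one is set, otherwise apply a default, then export.
  : "${SPDK_TEST_CRYPTO:=0}"              # already set to 1 in this run, so the trace shows ": 1"
  export SPDK_TEST_CRYPTO
  : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"   # left unset, so the default "rdma" is used
  export SPDK_TEST_NVMF_TRANSPORT
  : "${SPDK_RUN_UBSAN:=0}"                # assumed default; this run had it set to 1
  export SPDK_RUN_UBSAN

The surrounding SPDK_LIB_DIR, DPDK_LIB_DIR, VFIO_LIB_DIR, LD_LIBRARY_PATH and PYTHONPATH exports then point the test binaries and Python tooling at the tree built in this workspace rather than at anything installed system-wide.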
00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:27:03.776 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1773409 ]] 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1773409 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ 
-v testdir ]] 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.fAZCTR 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.fAZCTR/tests/interrupt /tmp/spdk.fAZCTR 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=954712064 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4329717760 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=123730337792 00:27:03.777 
08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376292864 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5645955072 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64683433984 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=25865379840 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9879552 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64687554560 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688148480 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=593920 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4096 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:27:03.777 * Looking for test storage... 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=123730337792 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:27:03.777 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7860547584 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.778 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 
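The df parsing traced above is set_test_storage choosing scratch space for the interrupt test: it snapshots "df -T" into per-mount arrays, resolves which mount backs /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt, and proceeds once that mount has at least the requested ~2 GiB free, which is why the "* Found test storage at ..." line is printed. Below is a standalone, simplified sketch of the same check; the "-B1" flag and the early-exit branch are assumptions added for readability, the rest mirrors the trace:

  testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
  requested_size=2214592512                 # bytes, as printed in the trace
  declare -A avails
  # Index available space by mount point, skipping the df header line.
  while read -r source fs size use avail _ mount; do
      avails["$mount"]=$avail
  done < <(df -T -B1 | grep -v Filesystem)  # -B1 assumed here so the units stay in bytes
  # Find the mount that backs the test directory, then compare free space.
  mount=$(df "$testdir" | awk '$1 !~ /Filesystem/ {print $6}')
  target_space=${avails[$mount]}
  if (( target_space >= requested_size )); then
      printf '* Found test storage at %s\n' "$testdir"
  else
      printf 'Not enough space for the test on %s\n' "$mount" >&2
      exit 1
  fi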
00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1773450 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1773450 /var/tmp/spdk.sock 00:27:03.778 08:02:48 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 1773450 ']' 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:03.778 08:02:48 reap_unregistered_poller -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:03.778 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:03.778 08:02:48 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:03.778 [2024-07-15 08:02:48.439303] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:27:03.778 [2024-07-15 08:02:48.439382] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1773450 ] 00:27:04.038 [2024-07-15 08:02:48.529343] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:04.038 [2024-07-15 08:02:48.623519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:04.038 [2024-07-15 08:02:48.623672] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:04.038 [2024-07-15 08:02:48.623672] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:04.038 [2024-07-15 08:02:48.694772] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:27:04.608 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:04.608 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:27:04.608 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:27:04.608 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:04.608 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:27:04.608 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:04.608 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:04.608 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:27:04.608 "name": "app_thread", 00:27:04.608 "id": 1, 00:27:04.608 "active_pollers": [], 00:27:04.608 "timed_pollers": [ 00:27:04.608 { 00:27:04.608 "name": "rpc_subsystem_poll_servers", 00:27:04.608 "id": 1, 00:27:04.608 "state": "waiting", 00:27:04.608 "run_count": 0, 00:27:04.608 "busy_count": 0, 00:27:04.608 "period_ticks": 10400000 00:27:04.608 } 00:27:04.608 ], 00:27:04.608 "paused_pollers": [] 00:27:04.608 }' 00:27:04.608 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:27:04.869 08:02:49 
reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:27:04.869 5000+0 records in 00:27:04.869 5000+0 records out 00:27:04.869 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0180865 s, 566 MB/s 00:27:04.869 08:02:49 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:27:05.129 AIO0 00:27:05.129 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:05.389 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:27:05.389 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:27:05.389 08:02:49 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:27:05.389 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:05.389 08:02:49 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:05.389 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:27:05.389 "name": "app_thread", 00:27:05.389 "id": 1, 00:27:05.389 "active_pollers": [], 00:27:05.389 "timed_pollers": [ 00:27:05.389 { 00:27:05.389 "name": "rpc_subsystem_poll_servers", 00:27:05.389 "id": 1, 00:27:05.389 "state": "waiting", 00:27:05.389 "run_count": 0, 00:27:05.389 "busy_count": 0, 00:27:05.389 "period_ticks": 10400000 00:27:05.389 } 00:27:05.389 ], 00:27:05.389 "paused_pollers": [] 00:27:05.389 }' 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:27:05.389 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1773450 00:27:05.389 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 1773450 ']' 00:27:05.389 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 1773450 00:27:05.389 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:27:05.389 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:27:05.389 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1773450 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1773450' 00:27:05.650 killing process with pid 1773450 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 1773450 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 1773450 00:27:05.650 08:02:50 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:27:05.650 08:02:50 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:27:05.650 00:27:05.650 real 0m2.223s 00:27:05.650 user 0m1.338s 00:27:05.650 sys 0m0.573s 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:05.650 08:02:50 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:27:05.650 ************************************ 00:27:05.650 END TEST reap_unregistered_poller 00:27:05.650 ************************************ 00:27:05.650 08:02:50 -- common/autotest_common.sh@1142 -- # return 0 00:27:05.650 08:02:50 -- spdk/autotest.sh@198 -- # uname -s 00:27:05.650 08:02:50 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:27:05.650 08:02:50 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:27:05.650 08:02:50 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:27:05.650 08:02:50 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:27:05.650 08:02:50 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:27:05.650 08:02:50 -- spdk/autotest.sh@260 -- # timing_exit lib 00:27:05.650 08:02:50 -- common/autotest_common.sh@728 -- # xtrace_disable 00:27:05.650 08:02:50 -- common/autotest_common.sh@10 -- # set +x 00:27:05.911 08:02:50 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:27:05.911 08:02:50 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:27:05.911 08:02:50 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:05.911 08:02:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:05.911 08:02:50 -- common/autotest_common.sh@10 -- # set +x 00:27:05.911 ************************************ 00:27:05.911 START TEST compress_compdev 00:27:05.911 ************************************ 00:27:05.911 08:02:50 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh 
compdev 00:27:05.911 * Looking for test storage... 00:27:05.911 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:27:05.911 08:02:50 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:27:05.911 08:02:50 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:05.911 08:02:50 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:05.911 08:02:50 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:05.911 08:02:50 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:05.911 08:02:50 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:05.911 08:02:50 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:05.911 08:02:50 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:05.911 08:02:50 compress_compdev -- paths/export.sh@5 -- # export PATH 00:27:05.912 08:02:50 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:27:05.912 08:02:50 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1773860 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1773860 00:27:05.912 08:02:50 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:27:05.912 08:02:50 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1773860 ']' 00:27:05.912 08:02:50 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:05.912 08:02:50 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:05.912 08:02:50 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:05.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:27:05.912 08:02:50 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:05.912 08:02:50 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:06.173 [2024-07-15 08:02:50.668694] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:27:06.173 [2024-07-15 08:02:50.668764] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1773860 ] 00:27:06.173 [2024-07-15 08:02:50.750466] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:06.173 [2024-07-15 08:02:50.851781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:06.173 [2024-07-15 08:02:50.851823] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:06.743 [2024-07-15 08:02:51.397748] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:07.002 08:02:51 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:07.002 08:02:51 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:27:07.002 08:02:51 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:27:07.002 08:02:51 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:07.002 08:02:51 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:10.303 [2024-07-15 08:02:54.571953] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1730670 PMD being used: compress_qat 00:27:10.303 08:02:54 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:10.303 [ 00:27:10.303 { 00:27:10.303 "name": "Nvme0n1", 00:27:10.303 "aliases": [ 00:27:10.303 "34abcc89-ba95-4f6e-97c1-f90cee4fb4e5" 00:27:10.303 ], 00:27:10.303 "product_name": "NVMe disk", 00:27:10.303 "block_size": 512, 00:27:10.303 "num_blocks": 3907029168, 00:27:10.303 "uuid": "34abcc89-ba95-4f6e-97c1-f90cee4fb4e5", 00:27:10.303 "assigned_rate_limits": { 00:27:10.303 "rw_ios_per_sec": 0, 00:27:10.303 "rw_mbytes_per_sec": 0, 00:27:10.303 "r_mbytes_per_sec": 0, 00:27:10.303 "w_mbytes_per_sec": 0 00:27:10.303 }, 00:27:10.303 "claimed": false, 00:27:10.303 "zoned": false, 00:27:10.303 "supported_io_types": { 00:27:10.303 "read": true, 00:27:10.303 "write": true, 00:27:10.303 "unmap": true, 00:27:10.303 "flush": true, 00:27:10.303 "reset": true, 00:27:10.303 "nvme_admin": true, 00:27:10.303 "nvme_io": true, 00:27:10.303 "nvme_io_md": false, 00:27:10.303 "write_zeroes": true, 00:27:10.303 "zcopy": false, 
00:27:10.303 "get_zone_info": false, 00:27:10.303 "zone_management": false, 00:27:10.303 "zone_append": false, 00:27:10.303 "compare": false, 00:27:10.303 "compare_and_write": false, 00:27:10.303 "abort": true, 00:27:10.303 "seek_hole": false, 00:27:10.303 "seek_data": false, 00:27:10.303 "copy": false, 00:27:10.303 "nvme_iov_md": false 00:27:10.303 }, 00:27:10.303 "driver_specific": { 00:27:10.303 "nvme": [ 00:27:10.303 { 00:27:10.303 "pci_address": "0000:65:00.0", 00:27:10.303 "trid": { 00:27:10.303 "trtype": "PCIe", 00:27:10.303 "traddr": "0000:65:00.0" 00:27:10.303 }, 00:27:10.303 "ctrlr_data": { 00:27:10.303 "cntlid": 0, 00:27:10.303 "vendor_id": "0x8086", 00:27:10.303 "model_number": "INTEL SSDPE2KX020T8", 00:27:10.303 "serial_number": "PHLJ9512038S2P0BGN", 00:27:10.303 "firmware_revision": "VDV10184", 00:27:10.303 "oacs": { 00:27:10.303 "security": 0, 00:27:10.303 "format": 1, 00:27:10.303 "firmware": 1, 00:27:10.303 "ns_manage": 1 00:27:10.303 }, 00:27:10.303 "multi_ctrlr": false, 00:27:10.303 "ana_reporting": false 00:27:10.303 }, 00:27:10.303 "vs": { 00:27:10.303 "nvme_version": "1.2" 00:27:10.303 }, 00:27:10.303 "ns_data": { 00:27:10.303 "id": 1, 00:27:10.303 "can_share": false 00:27:10.303 } 00:27:10.303 } 00:27:10.303 ], 00:27:10.303 "mp_policy": "active_passive" 00:27:10.303 } 00:27:10.303 } 00:27:10.303 ] 00:27:10.303 08:02:54 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:10.303 08:02:54 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:10.564 [2024-07-15 08:02:55.177096] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x157e730 PMD being used: compress_qat 00:27:11.640 43af2157-b456-4070-9fb1-33e2150e5180 00:27:11.640 08:02:56 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:11.901 0bf631ec-1a3c-43a4-8bde-1989440684f4 00:27:11.901 08:02:56 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:11.901 08:02:56 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:11.901 08:02:56 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:11.901 08:02:56 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:11.901 08:02:56 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:11.901 08:02:56 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:11.901 08:02:56 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:12.162 08:02:56 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:12.162 [ 00:27:12.162 { 00:27:12.162 "name": "0bf631ec-1a3c-43a4-8bde-1989440684f4", 00:27:12.162 "aliases": [ 00:27:12.162 "lvs0/lv0" 00:27:12.162 ], 00:27:12.162 "product_name": "Logical Volume", 00:27:12.162 "block_size": 512, 00:27:12.162 "num_blocks": 204800, 00:27:12.162 "uuid": "0bf631ec-1a3c-43a4-8bde-1989440684f4", 00:27:12.162 "assigned_rate_limits": { 00:27:12.162 "rw_ios_per_sec": 0, 00:27:12.162 "rw_mbytes_per_sec": 0, 00:27:12.162 "r_mbytes_per_sec": 0, 00:27:12.162 "w_mbytes_per_sec": 0 00:27:12.162 }, 00:27:12.162 "claimed": false, 00:27:12.162 "zoned": false, 00:27:12.162 "supported_io_types": { 
00:27:12.162 "read": true, 00:27:12.162 "write": true, 00:27:12.162 "unmap": true, 00:27:12.162 "flush": false, 00:27:12.162 "reset": true, 00:27:12.162 "nvme_admin": false, 00:27:12.162 "nvme_io": false, 00:27:12.162 "nvme_io_md": false, 00:27:12.162 "write_zeroes": true, 00:27:12.162 "zcopy": false, 00:27:12.162 "get_zone_info": false, 00:27:12.162 "zone_management": false, 00:27:12.162 "zone_append": false, 00:27:12.162 "compare": false, 00:27:12.162 "compare_and_write": false, 00:27:12.162 "abort": false, 00:27:12.162 "seek_hole": true, 00:27:12.162 "seek_data": true, 00:27:12.162 "copy": false, 00:27:12.162 "nvme_iov_md": false 00:27:12.162 }, 00:27:12.162 "driver_specific": { 00:27:12.162 "lvol": { 00:27:12.162 "lvol_store_uuid": "43af2157-b456-4070-9fb1-33e2150e5180", 00:27:12.162 "base_bdev": "Nvme0n1", 00:27:12.162 "thin_provision": true, 00:27:12.162 "num_allocated_clusters": 0, 00:27:12.162 "snapshot": false, 00:27:12.162 "clone": false, 00:27:12.162 "esnap_clone": false 00:27:12.162 } 00:27:12.162 } 00:27:12.162 } 00:27:12.162 ] 00:27:12.162 08:02:56 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:12.422 08:02:56 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:12.422 08:02:56 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:12.422 [2024-07-15 08:02:57.107834] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:12.422 COMP_lvs0/lv0 00:27:12.422 08:02:57 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:12.422 08:02:57 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:12.422 08:02:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:12.422 08:02:57 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:12.422 08:02:57 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:12.422 08:02:57 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:12.422 08:02:57 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:12.683 08:02:57 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:12.943 [ 00:27:12.943 { 00:27:12.943 "name": "COMP_lvs0/lv0", 00:27:12.943 "aliases": [ 00:27:12.943 "593dd400-a247-5103-b538-a9d25dd27691" 00:27:12.943 ], 00:27:12.943 "product_name": "compress", 00:27:12.943 "block_size": 512, 00:27:12.943 "num_blocks": 200704, 00:27:12.943 "uuid": "593dd400-a247-5103-b538-a9d25dd27691", 00:27:12.943 "assigned_rate_limits": { 00:27:12.944 "rw_ios_per_sec": 0, 00:27:12.944 "rw_mbytes_per_sec": 0, 00:27:12.944 "r_mbytes_per_sec": 0, 00:27:12.944 "w_mbytes_per_sec": 0 00:27:12.944 }, 00:27:12.944 "claimed": false, 00:27:12.944 "zoned": false, 00:27:12.944 "supported_io_types": { 00:27:12.944 "read": true, 00:27:12.944 "write": true, 00:27:12.944 "unmap": false, 00:27:12.944 "flush": false, 00:27:12.944 "reset": false, 00:27:12.944 "nvme_admin": false, 00:27:12.944 "nvme_io": false, 00:27:12.944 "nvme_io_md": false, 00:27:12.944 "write_zeroes": true, 00:27:12.944 "zcopy": false, 00:27:12.944 "get_zone_info": false, 00:27:12.944 "zone_management": false, 00:27:12.944 "zone_append": false, 00:27:12.944 
"compare": false, 00:27:12.944 "compare_and_write": false, 00:27:12.944 "abort": false, 00:27:12.944 "seek_hole": false, 00:27:12.944 "seek_data": false, 00:27:12.944 "copy": false, 00:27:12.944 "nvme_iov_md": false 00:27:12.944 }, 00:27:12.944 "driver_specific": { 00:27:12.944 "compress": { 00:27:12.944 "name": "COMP_lvs0/lv0", 00:27:12.944 "base_bdev_name": "0bf631ec-1a3c-43a4-8bde-1989440684f4" 00:27:12.944 } 00:27:12.944 } 00:27:12.944 } 00:27:12.944 ] 00:27:12.944 08:02:57 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:12.944 08:02:57 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:12.944 [2024-07-15 08:02:57.657354] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1b4c1b15c0 PMD being used: compress_qat 00:27:12.944 [2024-07-15 08:02:57.660108] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1566cf0 PMD being used: compress_qat 00:27:12.944 Running I/O for 3 seconds... 00:27:16.240 00:27:16.240 Latency(us) 00:27:16.240 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:16.240 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:16.240 Verification LBA range: start 0x0 length 0x3100 00:27:16.240 COMP_lvs0/lv0 : 3.02 1534.14 5.99 0.00 0.00 20762.08 475.77 22483.89 00:27:16.240 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:16.240 Verification LBA range: start 0x3100 length 0x3100 00:27:16.240 COMP_lvs0/lv0 : 3.01 1612.98 6.30 0.00 0.00 19711.31 392.27 21576.47 00:27:16.240 =================================================================================================================== 00:27:16.240 Total : 3147.12 12.29 0.00 0.00 20223.78 392.27 22483.89 00:27:16.240 0 00:27:16.240 08:03:00 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:27:16.240 08:03:00 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:16.240 08:03:00 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:16.501 08:03:01 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:16.501 08:03:01 compress_compdev -- compress/compress.sh@78 -- # killprocess 1773860 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1773860 ']' 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1773860 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1773860 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1773860' 00:27:16.501 killing process with pid 1773860 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@967 -- # kill 1773860 00:27:16.501 Received shutdown signal, test time was about 3.000000 seconds 00:27:16.501 00:27:16.501 Latency(us) 00:27:16.501 Device Information : runtime(s) 
IOPS MiB/s Fail/s TO/s Average min max 00:27:16.501 =================================================================================================================== 00:27:16.501 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:16.501 08:03:01 compress_compdev -- common/autotest_common.sh@972 -- # wait 1773860 00:27:19.048 08:03:03 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:27:19.048 08:03:03 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:19.048 08:03:03 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1776127 00:27:19.048 08:03:03 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:19.048 08:03:03 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1776127 00:27:19.048 08:03:03 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:27:19.048 08:03:03 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1776127 ']' 00:27:19.048 08:03:03 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:19.048 08:03:03 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:19.048 08:03:03 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:19.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:19.048 08:03:03 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:19.048 08:03:03 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:19.048 [2024-07-15 08:03:03.683741] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:27:19.048 [2024-07-15 08:03:03.683810] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1776127 ] 00:27:19.048 [2024-07-15 08:03:03.764530] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:19.307 [2024-07-15 08:03:03.866531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:19.307 [2024-07-15 08:03:03.866537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:19.879 [2024-07-15 08:03:04.405666] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:19.879 08:03:04 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:19.879 08:03:04 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:27:19.879 08:03:04 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:27:19.879 08:03:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:19.879 08:03:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:23.177 [2024-07-15 08:03:07.587890] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a7b670 PMD being used: compress_qat 00:27:23.177 08:03:07 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:23.177 08:03:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:23.437 [ 00:27:23.437 { 00:27:23.437 "name": "Nvme0n1", 00:27:23.437 "aliases": [ 00:27:23.437 "103fd362-73cd-40f3-8f84-a3f71b452619" 00:27:23.437 ], 00:27:23.437 "product_name": "NVMe disk", 00:27:23.437 "block_size": 512, 00:27:23.437 "num_blocks": 3907029168, 00:27:23.437 "uuid": "103fd362-73cd-40f3-8f84-a3f71b452619", 00:27:23.437 "assigned_rate_limits": { 00:27:23.437 "rw_ios_per_sec": 0, 00:27:23.437 "rw_mbytes_per_sec": 0, 00:27:23.437 "r_mbytes_per_sec": 0, 00:27:23.437 "w_mbytes_per_sec": 0 00:27:23.437 }, 00:27:23.438 "claimed": false, 00:27:23.438 "zoned": false, 00:27:23.438 "supported_io_types": { 00:27:23.438 "read": true, 00:27:23.438 "write": true, 00:27:23.438 "unmap": true, 00:27:23.438 "flush": true, 00:27:23.438 "reset": true, 00:27:23.438 "nvme_admin": true, 00:27:23.438 "nvme_io": true, 00:27:23.438 "nvme_io_md": false, 00:27:23.438 "write_zeroes": true, 00:27:23.438 "zcopy": false, 00:27:23.438 "get_zone_info": false, 00:27:23.438 "zone_management": false, 00:27:23.438 "zone_append": false, 00:27:23.438 "compare": false, 00:27:23.438 "compare_and_write": false, 00:27:23.438 "abort": true, 00:27:23.438 "seek_hole": false, 00:27:23.438 "seek_data": false, 00:27:23.438 
"copy": false, 00:27:23.438 "nvme_iov_md": false 00:27:23.438 }, 00:27:23.438 "driver_specific": { 00:27:23.438 "nvme": [ 00:27:23.438 { 00:27:23.438 "pci_address": "0000:65:00.0", 00:27:23.438 "trid": { 00:27:23.438 "trtype": "PCIe", 00:27:23.438 "traddr": "0000:65:00.0" 00:27:23.438 }, 00:27:23.438 "ctrlr_data": { 00:27:23.438 "cntlid": 0, 00:27:23.438 "vendor_id": "0x8086", 00:27:23.438 "model_number": "INTEL SSDPE2KX020T8", 00:27:23.438 "serial_number": "PHLJ9512038S2P0BGN", 00:27:23.438 "firmware_revision": "VDV10184", 00:27:23.438 "oacs": { 00:27:23.438 "security": 0, 00:27:23.438 "format": 1, 00:27:23.438 "firmware": 1, 00:27:23.438 "ns_manage": 1 00:27:23.438 }, 00:27:23.438 "multi_ctrlr": false, 00:27:23.438 "ana_reporting": false 00:27:23.438 }, 00:27:23.438 "vs": { 00:27:23.438 "nvme_version": "1.2" 00:27:23.438 }, 00:27:23.438 "ns_data": { 00:27:23.438 "id": 1, 00:27:23.438 "can_share": false 00:27:23.438 } 00:27:23.438 } 00:27:23.438 ], 00:27:23.438 "mp_policy": "active_passive" 00:27:23.438 } 00:27:23.438 } 00:27:23.438 ] 00:27:23.438 08:03:08 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:23.438 08:03:08 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:23.698 [2024-07-15 08:03:08.245111] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28c9a10 PMD being used: compress_qat 00:27:24.640 dd5b71ec-ee77-47f2-857b-cbf4f03cc69c 00:27:24.640 08:03:09 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:24.901 1c428c3d-9d17-4e04-94db-0346334ee560 00:27:24.901 08:03:09 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:24.901 08:03:09 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:24.901 08:03:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:24.901 08:03:09 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:24.901 08:03:09 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:24.901 08:03:09 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:24.901 08:03:09 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:25.161 08:03:09 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:25.421 [ 00:27:25.421 { 00:27:25.421 "name": "1c428c3d-9d17-4e04-94db-0346334ee560", 00:27:25.421 "aliases": [ 00:27:25.421 "lvs0/lv0" 00:27:25.421 ], 00:27:25.421 "product_name": "Logical Volume", 00:27:25.421 "block_size": 512, 00:27:25.421 "num_blocks": 204800, 00:27:25.421 "uuid": "1c428c3d-9d17-4e04-94db-0346334ee560", 00:27:25.421 "assigned_rate_limits": { 00:27:25.421 "rw_ios_per_sec": 0, 00:27:25.421 "rw_mbytes_per_sec": 0, 00:27:25.421 "r_mbytes_per_sec": 0, 00:27:25.421 "w_mbytes_per_sec": 0 00:27:25.421 }, 00:27:25.421 "claimed": false, 00:27:25.421 "zoned": false, 00:27:25.421 "supported_io_types": { 00:27:25.421 "read": true, 00:27:25.421 "write": true, 00:27:25.421 "unmap": true, 00:27:25.421 "flush": false, 00:27:25.421 "reset": true, 00:27:25.421 "nvme_admin": false, 00:27:25.421 "nvme_io": false, 00:27:25.421 "nvme_io_md": false, 00:27:25.421 "write_zeroes": true, 00:27:25.421 
"zcopy": false, 00:27:25.421 "get_zone_info": false, 00:27:25.421 "zone_management": false, 00:27:25.422 "zone_append": false, 00:27:25.422 "compare": false, 00:27:25.422 "compare_and_write": false, 00:27:25.422 "abort": false, 00:27:25.422 "seek_hole": true, 00:27:25.422 "seek_data": true, 00:27:25.422 "copy": false, 00:27:25.422 "nvme_iov_md": false 00:27:25.422 }, 00:27:25.422 "driver_specific": { 00:27:25.422 "lvol": { 00:27:25.422 "lvol_store_uuid": "dd5b71ec-ee77-47f2-857b-cbf4f03cc69c", 00:27:25.422 "base_bdev": "Nvme0n1", 00:27:25.422 "thin_provision": true, 00:27:25.422 "num_allocated_clusters": 0, 00:27:25.422 "snapshot": false, 00:27:25.422 "clone": false, 00:27:25.422 "esnap_clone": false 00:27:25.422 } 00:27:25.422 } 00:27:25.422 } 00:27:25.422 ] 00:27:25.422 08:03:09 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:25.422 08:03:09 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:27:25.422 08:03:09 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:27:25.422 [2024-07-15 08:03:10.167139] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:25.422 COMP_lvs0/lv0 00:27:25.695 08:03:10 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:25.695 08:03:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:25.958 [ 00:27:25.958 { 00:27:25.958 "name": "COMP_lvs0/lv0", 00:27:25.958 "aliases": [ 00:27:25.958 "241ad9d2-458c-5566-8c9a-45f26fff2755" 00:27:25.958 ], 00:27:25.958 "product_name": "compress", 00:27:25.958 "block_size": 512, 00:27:25.958 "num_blocks": 200704, 00:27:25.958 "uuid": "241ad9d2-458c-5566-8c9a-45f26fff2755", 00:27:25.958 "assigned_rate_limits": { 00:27:25.958 "rw_ios_per_sec": 0, 00:27:25.958 "rw_mbytes_per_sec": 0, 00:27:25.958 "r_mbytes_per_sec": 0, 00:27:25.958 "w_mbytes_per_sec": 0 00:27:25.958 }, 00:27:25.958 "claimed": false, 00:27:25.958 "zoned": false, 00:27:25.958 "supported_io_types": { 00:27:25.958 "read": true, 00:27:25.958 "write": true, 00:27:25.958 "unmap": false, 00:27:25.958 "flush": false, 00:27:25.958 "reset": false, 00:27:25.958 "nvme_admin": false, 00:27:25.958 "nvme_io": false, 00:27:25.958 "nvme_io_md": false, 00:27:25.958 "write_zeroes": true, 00:27:25.958 "zcopy": false, 00:27:25.958 "get_zone_info": false, 00:27:25.958 "zone_management": false, 00:27:25.958 "zone_append": false, 00:27:25.958 "compare": false, 00:27:25.958 "compare_and_write": false, 00:27:25.958 "abort": false, 00:27:25.958 "seek_hole": false, 00:27:25.958 "seek_data": false, 00:27:25.958 "copy": false, 00:27:25.958 "nvme_iov_md": false 00:27:25.958 }, 00:27:25.958 "driver_specific": { 00:27:25.958 
"compress": { 00:27:25.958 "name": "COMP_lvs0/lv0", 00:27:25.958 "base_bdev_name": "1c428c3d-9d17-4e04-94db-0346334ee560" 00:27:25.958 } 00:27:25.958 } 00:27:25.958 } 00:27:25.958 ] 00:27:25.958 08:03:10 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:25.958 08:03:10 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:25.958 [2024-07-15 08:03:10.704630] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f94f01b15c0 PMD being used: compress_qat 00:27:25.958 [2024-07-15 08:03:10.707496] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x295b540 PMD being used: compress_qat 00:27:25.958 Running I/O for 3 seconds... 00:27:29.252 00:27:29.252 Latency(us) 00:27:29.252 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:29.252 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:29.252 Verification LBA range: start 0x0 length 0x3100 00:27:29.252 COMP_lvs0/lv0 : 3.01 1539.63 6.01 0.00 0.00 20712.27 456.86 21979.77 00:27:29.252 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:29.252 Verification LBA range: start 0x3100 length 0x3100 00:27:29.252 COMP_lvs0/lv0 : 3.01 1616.05 6.31 0.00 0.00 19668.76 463.16 22181.42 00:27:29.252 =================================================================================================================== 00:27:29.253 Total : 3155.68 12.33 0.00 0.00 20177.73 456.86 22181.42 00:27:29.253 0 00:27:29.253 08:03:13 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:27:29.253 08:03:13 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:29.253 08:03:13 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:29.513 08:03:14 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:29.513 08:03:14 compress_compdev -- compress/compress.sh@78 -- # killprocess 1776127 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1776127 ']' 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1776127 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1776127 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1776127' 00:27:29.513 killing process with pid 1776127 00:27:29.513 08:03:14 compress_compdev -- common/autotest_common.sh@967 -- # kill 1776127 00:27:29.513 Received shutdown signal, test time was about 3.000000 seconds 00:27:29.513 00:27:29.513 Latency(us) 00:27:29.513 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:29.513 =================================================================================================================== 00:27:29.513 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:29.513 08:03:14 compress_compdev -- 
common/autotest_common.sh@972 -- # wait 1776127 00:27:32.084 08:03:16 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:27:32.084 08:03:16 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:32.084 08:03:16 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1778666 00:27:32.084 08:03:16 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:32.084 08:03:16 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1778666 00:27:32.084 08:03:16 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:27:32.084 08:03:16 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1778666 ']' 00:27:32.084 08:03:16 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:32.084 08:03:16 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:32.084 08:03:16 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:32.084 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:32.084 08:03:16 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:32.084 08:03:16 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:32.084 [2024-07-15 08:03:16.749795] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:27:32.084 [2024-07-15 08:03:16.749861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1778666 ] 00:27:32.084 [2024-07-15 08:03:16.831219] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:32.345 [2024-07-15 08:03:16.931139] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:32.345 [2024-07-15 08:03:16.931141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:32.915 [2024-07-15 08:03:17.462500] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:32.915 08:03:17 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:32.915 08:03:17 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:27:32.915 08:03:17 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:27:32.915 08:03:17 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:32.915 08:03:17 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:36.213 [2024-07-15 08:03:20.660262] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2b52670 PMD being used: compress_qat 00:27:36.213 08:03:20 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:36.213 08:03:20 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:36.213 08:03:20 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:36.213 08:03:20 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:36.213 08:03:20 
compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:36.213 08:03:20 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:36.213 08:03:20 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:36.213 08:03:20 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:36.473 [ 00:27:36.473 { 00:27:36.473 "name": "Nvme0n1", 00:27:36.473 "aliases": [ 00:27:36.473 "ba148600-4822-4f87-98c3-cb2c68deade3" 00:27:36.473 ], 00:27:36.473 "product_name": "NVMe disk", 00:27:36.473 "block_size": 512, 00:27:36.473 "num_blocks": 3907029168, 00:27:36.473 "uuid": "ba148600-4822-4f87-98c3-cb2c68deade3", 00:27:36.473 "assigned_rate_limits": { 00:27:36.473 "rw_ios_per_sec": 0, 00:27:36.473 "rw_mbytes_per_sec": 0, 00:27:36.473 "r_mbytes_per_sec": 0, 00:27:36.473 "w_mbytes_per_sec": 0 00:27:36.473 }, 00:27:36.473 "claimed": false, 00:27:36.473 "zoned": false, 00:27:36.473 "supported_io_types": { 00:27:36.473 "read": true, 00:27:36.473 "write": true, 00:27:36.473 "unmap": true, 00:27:36.473 "flush": true, 00:27:36.473 "reset": true, 00:27:36.473 "nvme_admin": true, 00:27:36.473 "nvme_io": true, 00:27:36.473 "nvme_io_md": false, 00:27:36.473 "write_zeroes": true, 00:27:36.473 "zcopy": false, 00:27:36.473 "get_zone_info": false, 00:27:36.473 "zone_management": false, 00:27:36.473 "zone_append": false, 00:27:36.473 "compare": false, 00:27:36.473 "compare_and_write": false, 00:27:36.473 "abort": true, 00:27:36.473 "seek_hole": false, 00:27:36.473 "seek_data": false, 00:27:36.473 "copy": false, 00:27:36.473 "nvme_iov_md": false 00:27:36.473 }, 00:27:36.473 "driver_specific": { 00:27:36.473 "nvme": [ 00:27:36.473 { 00:27:36.473 "pci_address": "0000:65:00.0", 00:27:36.473 "trid": { 00:27:36.473 "trtype": "PCIe", 00:27:36.473 "traddr": "0000:65:00.0" 00:27:36.473 }, 00:27:36.473 "ctrlr_data": { 00:27:36.473 "cntlid": 0, 00:27:36.473 "vendor_id": "0x8086", 00:27:36.473 "model_number": "INTEL SSDPE2KX020T8", 00:27:36.473 "serial_number": "PHLJ9512038S2P0BGN", 00:27:36.473 "firmware_revision": "VDV10184", 00:27:36.473 "oacs": { 00:27:36.473 "security": 0, 00:27:36.473 "format": 1, 00:27:36.473 "firmware": 1, 00:27:36.473 "ns_manage": 1 00:27:36.473 }, 00:27:36.473 "multi_ctrlr": false, 00:27:36.473 "ana_reporting": false 00:27:36.473 }, 00:27:36.473 "vs": { 00:27:36.473 "nvme_version": "1.2" 00:27:36.473 }, 00:27:36.473 "ns_data": { 00:27:36.473 "id": 1, 00:27:36.473 "can_share": false 00:27:36.473 } 00:27:36.473 } 00:27:36.473 ], 00:27:36.473 "mp_policy": "active_passive" 00:27:36.473 } 00:27:36.473 } 00:27:36.473 ] 00:27:36.473 08:03:21 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:36.473 08:03:21 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:36.733 [2024-07-15 08:03:21.321660] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x29a0a10 PMD being used: compress_qat 00:27:37.673 2f4360fa-8aae-462c-81fc-fb31afcba763 00:27:37.673 08:03:22 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:37.932 2bea015f-ae70-43cf-a466-a34f9870df89 00:27:37.932 08:03:22 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 
00:27:37.932 08:03:22 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:37.932 08:03:22 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:37.932 08:03:22 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:37.932 08:03:22 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:37.932 08:03:22 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:37.932 08:03:22 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:38.192 08:03:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:38.192 [ 00:27:38.192 { 00:27:38.192 "name": "2bea015f-ae70-43cf-a466-a34f9870df89", 00:27:38.192 "aliases": [ 00:27:38.192 "lvs0/lv0" 00:27:38.192 ], 00:27:38.192 "product_name": "Logical Volume", 00:27:38.192 "block_size": 512, 00:27:38.192 "num_blocks": 204800, 00:27:38.192 "uuid": "2bea015f-ae70-43cf-a466-a34f9870df89", 00:27:38.192 "assigned_rate_limits": { 00:27:38.192 "rw_ios_per_sec": 0, 00:27:38.192 "rw_mbytes_per_sec": 0, 00:27:38.192 "r_mbytes_per_sec": 0, 00:27:38.192 "w_mbytes_per_sec": 0 00:27:38.192 }, 00:27:38.192 "claimed": false, 00:27:38.192 "zoned": false, 00:27:38.192 "supported_io_types": { 00:27:38.192 "read": true, 00:27:38.192 "write": true, 00:27:38.192 "unmap": true, 00:27:38.192 "flush": false, 00:27:38.192 "reset": true, 00:27:38.192 "nvme_admin": false, 00:27:38.192 "nvme_io": false, 00:27:38.192 "nvme_io_md": false, 00:27:38.192 "write_zeroes": true, 00:27:38.192 "zcopy": false, 00:27:38.192 "get_zone_info": false, 00:27:38.192 "zone_management": false, 00:27:38.192 "zone_append": false, 00:27:38.192 "compare": false, 00:27:38.192 "compare_and_write": false, 00:27:38.192 "abort": false, 00:27:38.192 "seek_hole": true, 00:27:38.192 "seek_data": true, 00:27:38.192 "copy": false, 00:27:38.192 "nvme_iov_md": false 00:27:38.192 }, 00:27:38.192 "driver_specific": { 00:27:38.192 "lvol": { 00:27:38.192 "lvol_store_uuid": "2f4360fa-8aae-462c-81fc-fb31afcba763", 00:27:38.192 "base_bdev": "Nvme0n1", 00:27:38.192 "thin_provision": true, 00:27:38.192 "num_allocated_clusters": 0, 00:27:38.192 "snapshot": false, 00:27:38.192 "clone": false, 00:27:38.192 "esnap_clone": false 00:27:38.192 } 00:27:38.192 } 00:27:38.192 } 00:27:38.192 ] 00:27:38.453 08:03:22 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:38.453 08:03:22 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:27:38.453 08:03:22 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:27:38.453 [2024-07-15 08:03:23.164012] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:38.453 COMP_lvs0/lv0 00:27:38.453 08:03:23 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:38.453 08:03:23 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:38.453 08:03:23 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:38.453 08:03:23 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:38.453 08:03:23 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:38.453 08:03:23 compress_compdev -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:38.453 08:03:23 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:38.713 08:03:23 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:38.973 [ 00:27:38.973 { 00:27:38.973 "name": "COMP_lvs0/lv0", 00:27:38.973 "aliases": [ 00:27:38.973 "a7b0e658-e9ea-5147-a1e1-2e8b336b3cd0" 00:27:38.973 ], 00:27:38.973 "product_name": "compress", 00:27:38.973 "block_size": 4096, 00:27:38.973 "num_blocks": 25088, 00:27:38.973 "uuid": "a7b0e658-e9ea-5147-a1e1-2e8b336b3cd0", 00:27:38.973 "assigned_rate_limits": { 00:27:38.973 "rw_ios_per_sec": 0, 00:27:38.973 "rw_mbytes_per_sec": 0, 00:27:38.973 "r_mbytes_per_sec": 0, 00:27:38.973 "w_mbytes_per_sec": 0 00:27:38.973 }, 00:27:38.973 "claimed": false, 00:27:38.973 "zoned": false, 00:27:38.973 "supported_io_types": { 00:27:38.973 "read": true, 00:27:38.973 "write": true, 00:27:38.973 "unmap": false, 00:27:38.973 "flush": false, 00:27:38.973 "reset": false, 00:27:38.973 "nvme_admin": false, 00:27:38.973 "nvme_io": false, 00:27:38.973 "nvme_io_md": false, 00:27:38.973 "write_zeroes": true, 00:27:38.973 "zcopy": false, 00:27:38.973 "get_zone_info": false, 00:27:38.973 "zone_management": false, 00:27:38.973 "zone_append": false, 00:27:38.973 "compare": false, 00:27:38.973 "compare_and_write": false, 00:27:38.973 "abort": false, 00:27:38.973 "seek_hole": false, 00:27:38.973 "seek_data": false, 00:27:38.973 "copy": false, 00:27:38.973 "nvme_iov_md": false 00:27:38.973 }, 00:27:38.973 "driver_specific": { 00:27:38.973 "compress": { 00:27:38.973 "name": "COMP_lvs0/lv0", 00:27:38.973 "base_bdev_name": "2bea015f-ae70-43cf-a466-a34f9870df89" 00:27:38.973 } 00:27:38.973 } 00:27:38.973 } 00:27:38.973 ] 00:27:38.973 08:03:23 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:38.973 08:03:23 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:27:38.973 [2024-07-15 08:03:23.697500] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f21a01b15c0 PMD being used: compress_qat 00:27:38.973 [2024-07-15 08:03:23.700290] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2988d80 PMD being used: compress_qat 00:27:38.973 Running I/O for 3 seconds... 
00:27:42.271 00:27:42.271 Latency(us) 00:27:42.271 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:42.271 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:27:42.271 Verification LBA range: start 0x0 length 0x3100 00:27:42.271 COMP_lvs0/lv0 : 3.01 1545.49 6.04 0.00 0.00 20637.98 354.46 23391.31 00:27:42.271 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:27:42.271 Verification LBA range: start 0x3100 length 0x3100 00:27:42.271 COMP_lvs0/lv0 : 3.01 1610.97 6.29 0.00 0.00 19743.30 258.36 21878.94 00:27:42.271 =================================================================================================================== 00:27:42.271 Total : 3156.46 12.33 0.00 0.00 20181.32 258.36 23391.31 00:27:42.271 0 00:27:42.271 08:03:26 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:27:42.271 08:03:26 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:42.271 08:03:26 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:42.531 08:03:27 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:27:42.531 08:03:27 compress_compdev -- compress/compress.sh@78 -- # killprocess 1778666 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1778666 ']' 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1778666 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1778666 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1778666' 00:27:42.531 killing process with pid 1778666 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@967 -- # kill 1778666 00:27:42.531 Received shutdown signal, test time was about 3.000000 seconds 00:27:42.531 00:27:42.531 Latency(us) 00:27:42.531 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:42.531 =================================================================================================================== 00:27:42.531 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:42.531 08:03:27 compress_compdev -- common/autotest_common.sh@972 -- # wait 1778666 00:27:45.071 08:03:29 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:27:45.071 08:03:29 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:27:45.071 08:03:29 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1780626 00:27:45.071 08:03:29 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:45.071 08:03:29 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1780626 00:27:45.071 08:03:29 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:27:45.071 08:03:29 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1780626 ']' 00:27:45.071 08:03:29 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:45.071 08:03:29 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:45.071 08:03:29 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:45.071 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:45.071 08:03:29 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:45.071 08:03:29 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:45.071 [2024-07-15 08:03:29.674914] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:27:45.071 [2024-07-15 08:03:29.674980] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1780626 ] 00:27:45.071 [2024-07-15 08:03:29.765282] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:45.331 [2024-07-15 08:03:29.860237] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:45.331 [2024-07-15 08:03:29.860389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:45.331 [2024-07-15 08:03:29.860389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:45.591 [2024-07-15 08:03:30.341025] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:27:45.850 08:03:30 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:45.850 08:03:30 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:27:45.850 08:03:30 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:27:45.850 08:03:30 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:45.850 08:03:30 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:49.149 [2024-07-15 08:03:33.572083] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13ad0c0 PMD being used: compress_qat 00:27:49.149 08:03:33 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:49.149 08:03:33 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:49.409 [ 00:27:49.409 { 00:27:49.409 "name": "Nvme0n1", 00:27:49.409 "aliases": [ 00:27:49.409 "bc1139c4-81f7-402a-ae92-c11f7e748faf" 00:27:49.409 ], 00:27:49.409 
"product_name": "NVMe disk", 00:27:49.409 "block_size": 512, 00:27:49.409 "num_blocks": 3907029168, 00:27:49.409 "uuid": "bc1139c4-81f7-402a-ae92-c11f7e748faf", 00:27:49.409 "assigned_rate_limits": { 00:27:49.409 "rw_ios_per_sec": 0, 00:27:49.409 "rw_mbytes_per_sec": 0, 00:27:49.409 "r_mbytes_per_sec": 0, 00:27:49.409 "w_mbytes_per_sec": 0 00:27:49.409 }, 00:27:49.409 "claimed": false, 00:27:49.409 "zoned": false, 00:27:49.409 "supported_io_types": { 00:27:49.409 "read": true, 00:27:49.409 "write": true, 00:27:49.409 "unmap": true, 00:27:49.409 "flush": true, 00:27:49.409 "reset": true, 00:27:49.409 "nvme_admin": true, 00:27:49.409 "nvme_io": true, 00:27:49.409 "nvme_io_md": false, 00:27:49.409 "write_zeroes": true, 00:27:49.409 "zcopy": false, 00:27:49.409 "get_zone_info": false, 00:27:49.409 "zone_management": false, 00:27:49.409 "zone_append": false, 00:27:49.409 "compare": false, 00:27:49.409 "compare_and_write": false, 00:27:49.409 "abort": true, 00:27:49.409 "seek_hole": false, 00:27:49.409 "seek_data": false, 00:27:49.409 "copy": false, 00:27:49.409 "nvme_iov_md": false 00:27:49.409 }, 00:27:49.409 "driver_specific": { 00:27:49.409 "nvme": [ 00:27:49.409 { 00:27:49.409 "pci_address": "0000:65:00.0", 00:27:49.409 "trid": { 00:27:49.409 "trtype": "PCIe", 00:27:49.409 "traddr": "0000:65:00.0" 00:27:49.409 }, 00:27:49.409 "ctrlr_data": { 00:27:49.409 "cntlid": 0, 00:27:49.409 "vendor_id": "0x8086", 00:27:49.409 "model_number": "INTEL SSDPE2KX020T8", 00:27:49.409 "serial_number": "PHLJ9512038S2P0BGN", 00:27:49.409 "firmware_revision": "VDV10184", 00:27:49.409 "oacs": { 00:27:49.409 "security": 0, 00:27:49.409 "format": 1, 00:27:49.409 "firmware": 1, 00:27:49.409 "ns_manage": 1 00:27:49.409 }, 00:27:49.409 "multi_ctrlr": false, 00:27:49.409 "ana_reporting": false 00:27:49.409 }, 00:27:49.409 "vs": { 00:27:49.409 "nvme_version": "1.2" 00:27:49.409 }, 00:27:49.409 "ns_data": { 00:27:49.409 "id": 1, 00:27:49.409 "can_share": false 00:27:49.409 } 00:27:49.409 } 00:27:49.409 ], 00:27:49.409 "mp_policy": "active_passive" 00:27:49.409 } 00:27:49.409 } 00:27:49.409 ] 00:27:49.409 08:03:34 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:49.409 08:03:34 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:27:49.669 [2024-07-15 08:03:34.198039] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11fb4b0 PMD being used: compress_qat 00:27:50.608 a9b04960-09b6-4aa8-9b33-e80b99feff08 00:27:50.608 08:03:35 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:27:50.868 c2e06dc7-1bb0-4b53-aadc-34ad795bd58a 00:27:50.868 08:03:35 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:27:50.868 08:03:35 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:27:50.868 08:03:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:50.868 08:03:35 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:50.868 08:03:35 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:50.868 08:03:35 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:50.868 08:03:35 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:51.127 08:03:35 compress_compdev -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:27:51.127 [ 00:27:51.127 { 00:27:51.127 "name": "c2e06dc7-1bb0-4b53-aadc-34ad795bd58a", 00:27:51.127 "aliases": [ 00:27:51.127 "lvs0/lv0" 00:27:51.127 ], 00:27:51.127 "product_name": "Logical Volume", 00:27:51.127 "block_size": 512, 00:27:51.127 "num_blocks": 204800, 00:27:51.127 "uuid": "c2e06dc7-1bb0-4b53-aadc-34ad795bd58a", 00:27:51.127 "assigned_rate_limits": { 00:27:51.127 "rw_ios_per_sec": 0, 00:27:51.127 "rw_mbytes_per_sec": 0, 00:27:51.127 "r_mbytes_per_sec": 0, 00:27:51.127 "w_mbytes_per_sec": 0 00:27:51.127 }, 00:27:51.127 "claimed": false, 00:27:51.127 "zoned": false, 00:27:51.127 "supported_io_types": { 00:27:51.127 "read": true, 00:27:51.127 "write": true, 00:27:51.127 "unmap": true, 00:27:51.127 "flush": false, 00:27:51.127 "reset": true, 00:27:51.127 "nvme_admin": false, 00:27:51.127 "nvme_io": false, 00:27:51.128 "nvme_io_md": false, 00:27:51.128 "write_zeroes": true, 00:27:51.128 "zcopy": false, 00:27:51.128 "get_zone_info": false, 00:27:51.128 "zone_management": false, 00:27:51.128 "zone_append": false, 00:27:51.128 "compare": false, 00:27:51.128 "compare_and_write": false, 00:27:51.128 "abort": false, 00:27:51.128 "seek_hole": true, 00:27:51.128 "seek_data": true, 00:27:51.128 "copy": false, 00:27:51.128 "nvme_iov_md": false 00:27:51.128 }, 00:27:51.128 "driver_specific": { 00:27:51.128 "lvol": { 00:27:51.128 "lvol_store_uuid": "a9b04960-09b6-4aa8-9b33-e80b99feff08", 00:27:51.128 "base_bdev": "Nvme0n1", 00:27:51.128 "thin_provision": true, 00:27:51.128 "num_allocated_clusters": 0, 00:27:51.128 "snapshot": false, 00:27:51.128 "clone": false, 00:27:51.128 "esnap_clone": false 00:27:51.128 } 00:27:51.128 } 00:27:51.128 } 00:27:51.128 ] 00:27:51.387 08:03:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:51.387 08:03:35 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:27:51.387 08:03:35 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:27:51.387 [2024-07-15 08:03:36.085685] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:27:51.387 COMP_lvs0/lv0 00:27:51.387 08:03:36 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:27:51.387 08:03:36 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:27:51.387 08:03:36 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:51.387 08:03:36 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:27:51.387 08:03:36 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:51.387 08:03:36 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:51.387 08:03:36 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:51.682 08:03:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:27:51.941 [ 00:27:51.941 { 00:27:51.941 "name": "COMP_lvs0/lv0", 00:27:51.941 "aliases": [ 00:27:51.941 "122d8d94-581e-5f37-b252-56310bb2403b" 00:27:51.941 ], 00:27:51.941 "product_name": "compress", 00:27:51.941 "block_size": 512, 00:27:51.941 "num_blocks": 200704, 00:27:51.941 "uuid": 
"122d8d94-581e-5f37-b252-56310bb2403b", 00:27:51.941 "assigned_rate_limits": { 00:27:51.941 "rw_ios_per_sec": 0, 00:27:51.941 "rw_mbytes_per_sec": 0, 00:27:51.941 "r_mbytes_per_sec": 0, 00:27:51.941 "w_mbytes_per_sec": 0 00:27:51.941 }, 00:27:51.941 "claimed": false, 00:27:51.941 "zoned": false, 00:27:51.941 "supported_io_types": { 00:27:51.941 "read": true, 00:27:51.941 "write": true, 00:27:51.941 "unmap": false, 00:27:51.941 "flush": false, 00:27:51.941 "reset": false, 00:27:51.941 "nvme_admin": false, 00:27:51.941 "nvme_io": false, 00:27:51.941 "nvme_io_md": false, 00:27:51.941 "write_zeroes": true, 00:27:51.941 "zcopy": false, 00:27:51.941 "get_zone_info": false, 00:27:51.941 "zone_management": false, 00:27:51.941 "zone_append": false, 00:27:51.941 "compare": false, 00:27:51.941 "compare_and_write": false, 00:27:51.941 "abort": false, 00:27:51.941 "seek_hole": false, 00:27:51.941 "seek_data": false, 00:27:51.941 "copy": false, 00:27:51.941 "nvme_iov_md": false 00:27:51.941 }, 00:27:51.941 "driver_specific": { 00:27:51.941 "compress": { 00:27:51.941 "name": "COMP_lvs0/lv0", 00:27:51.941 "base_bdev_name": "c2e06dc7-1bb0-4b53-aadc-34ad795bd58a" 00:27:51.941 } 00:27:51.941 } 00:27:51.941 } 00:27:51.941 ] 00:27:51.941 08:03:36 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:27:51.941 08:03:36 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:51.941 [2024-07-15 08:03:36.595056] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7e441b1350 PMD being used: compress_qat 00:27:51.941 I/O targets: 00:27:51.941 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:27:51.941 00:27:51.941 00:27:51.941 CUnit - A unit testing framework for C - Version 2.1-3 00:27:51.941 http://cunit.sourceforge.net/ 00:27:51.941 00:27:51.941 00:27:51.941 Suite: bdevio tests on: COMP_lvs0/lv0 00:27:51.941 Test: blockdev write read block ...passed 00:27:51.941 Test: blockdev write zeroes read block ...passed 00:27:51.941 Test: blockdev write zeroes read no split ...passed 00:27:51.941 Test: blockdev write zeroes read split ...passed 00:27:52.201 Test: blockdev write zeroes read split partial ...passed 00:27:52.201 Test: blockdev reset ...[2024-07-15 08:03:36.726942] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:27:52.201 passed 00:27:52.201 Test: blockdev write read 8 blocks ...passed 00:27:52.201 Test: blockdev write read size > 128k ...passed 00:27:52.201 Test: blockdev write read invalid size ...passed 00:27:52.201 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:52.201 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:52.201 Test: blockdev write read max offset ...passed 00:27:52.201 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:52.201 Test: blockdev writev readv 8 blocks ...passed 00:27:52.201 Test: blockdev writev readv 30 x 1block ...passed 00:27:52.201 Test: blockdev writev readv block ...passed 00:27:52.201 Test: blockdev writev readv size > 128k ...passed 00:27:52.201 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:52.201 Test: blockdev comparev and writev ...passed 00:27:52.201 Test: blockdev nvme passthru rw ...passed 00:27:52.201 Test: blockdev nvme passthru vendor specific ...passed 00:27:52.201 Test: blockdev nvme admin passthru ...passed 00:27:52.201 Test: blockdev copy ...passed 00:27:52.201 00:27:52.201 Run Summary: Type Total Ran 
Passed Failed Inactive 00:27:52.201 suites 1 1 n/a 0 0 00:27:52.201 tests 23 23 23 0 0 00:27:52.201 asserts 130 130 130 0 n/a 00:27:52.201 00:27:52.201 Elapsed time = 0.345 seconds 00:27:52.201 0 00:27:52.201 08:03:36 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:27:52.201 08:03:36 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:27:52.461 08:03:36 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:27:52.461 08:03:37 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:27:52.461 08:03:37 compress_compdev -- compress/compress.sh@62 -- # killprocess 1780626 00:27:52.461 08:03:37 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1780626 ']' 00:27:52.461 08:03:37 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1780626 00:27:52.461 08:03:37 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:27:52.461 08:03:37 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:52.461 08:03:37 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1780626 00:27:52.720 08:03:37 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:52.720 08:03:37 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:52.720 08:03:37 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1780626' 00:27:52.720 killing process with pid 1780626 00:27:52.720 08:03:37 compress_compdev -- common/autotest_common.sh@967 -- # kill 1780626 00:27:52.720 08:03:37 compress_compdev -- common/autotest_common.sh@972 -- # wait 1780626 00:27:55.259 08:03:39 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:27:55.259 08:03:39 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:27:55.259 00:27:55.259 real 0m49.192s 00:27:55.259 user 1m51.061s 00:27:55.259 sys 0m4.235s 00:27:55.259 08:03:39 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:55.259 08:03:39 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:27:55.259 ************************************ 00:27:55.259 END TEST compress_compdev 00:27:55.259 ************************************ 00:27:55.259 08:03:39 -- common/autotest_common.sh@1142 -- # return 0 00:27:55.259 08:03:39 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:27:55.259 08:03:39 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:27:55.259 08:03:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:55.259 08:03:39 -- common/autotest_common.sh@10 -- # set +x 00:27:55.259 ************************************ 00:27:55.259 START TEST compress_isal 00:27:55.259 ************************************ 00:27:55.259 08:03:39 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:27:55.259 * Looking for test storage... 
00:27:55.259 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:27:55.259 08:03:39 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:27:55.259 08:03:39 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:55.259 08:03:39 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:55.259 08:03:39 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:55.259 08:03:39 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:55.259 08:03:39 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:55.259 08:03:39 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:55.259 08:03:39 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:55.260 08:03:39 compress_isal -- paths/export.sh@5 -- # export PATH 00:27:55.260 08:03:39 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@47 -- # : 0 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:27:55.260 08:03:39 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1782505 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1782505 00:27:55.260 08:03:39 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1782505 ']' 00:27:55.260 08:03:39 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:27:55.260 08:03:39 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:55.260 08:03:39 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:55.260 08:03:39 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:55.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:27:55.260 08:03:39 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:55.260 08:03:39 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:27:55.260 [2024-07-15 08:03:39.946526] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:27:55.260 [2024-07-15 08:03:39.946588] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1782505 ] 00:27:55.521 [2024-07-15 08:03:40.029834] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:55.521 [2024-07-15 08:03:40.134279] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:27:55.521 [2024-07-15 08:03:40.134286] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:56.091 08:03:40 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:56.091 08:03:40 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:27:56.091 08:03:40 compress_isal -- compress/compress.sh@74 -- # create_vols 00:27:56.091 08:03:40 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:27:56.091 08:03:40 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:27:59.403 08:03:43 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:27:59.403 08:03:43 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:27:59.404 08:03:43 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:59.404 08:03:43 compress_isal -- common/autotest_common.sh@899 -- # local i 00:27:59.404 08:03:43 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:59.404 08:03:43 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:59.404 08:03:43 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:27:59.404 08:03:44 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:27:59.664 [ 00:27:59.664 { 00:27:59.664 "name": "Nvme0n1", 00:27:59.664 "aliases": [ 00:27:59.664 "c70b758e-8085-4d2f-898c-088961395063" 00:27:59.664 ], 00:27:59.664 "product_name": "NVMe disk", 00:27:59.664 "block_size": 512, 00:27:59.664 "num_blocks": 3907029168, 00:27:59.664 "uuid": "c70b758e-8085-4d2f-898c-088961395063", 00:27:59.664 "assigned_rate_limits": { 00:27:59.664 "rw_ios_per_sec": 0, 00:27:59.664 "rw_mbytes_per_sec": 0, 00:27:59.664 "r_mbytes_per_sec": 0, 00:27:59.664 "w_mbytes_per_sec": 0 00:27:59.664 }, 00:27:59.664 "claimed": false, 00:27:59.664 "zoned": false, 00:27:59.664 "supported_io_types": { 00:27:59.664 "read": true, 00:27:59.664 "write": true, 00:27:59.664 "unmap": true, 00:27:59.664 "flush": true, 00:27:59.664 "reset": true, 00:27:59.664 "nvme_admin": true, 00:27:59.664 "nvme_io": true, 00:27:59.664 "nvme_io_md": false, 00:27:59.664 "write_zeroes": true, 00:27:59.664 "zcopy": false, 00:27:59.664 "get_zone_info": false, 00:27:59.664 "zone_management": false, 00:27:59.664 "zone_append": false, 00:27:59.664 "compare": false, 00:27:59.664 "compare_and_write": false, 00:27:59.664 "abort": true, 00:27:59.664 "seek_hole": false, 00:27:59.664 "seek_data": false, 00:27:59.664 "copy": false, 00:27:59.664 
"nvme_iov_md": false 00:27:59.664 }, 00:27:59.664 "driver_specific": { 00:27:59.664 "nvme": [ 00:27:59.664 { 00:27:59.664 "pci_address": "0000:65:00.0", 00:27:59.664 "trid": { 00:27:59.664 "trtype": "PCIe", 00:27:59.664 "traddr": "0000:65:00.0" 00:27:59.664 }, 00:27:59.664 "ctrlr_data": { 00:27:59.664 "cntlid": 0, 00:27:59.664 "vendor_id": "0x8086", 00:27:59.664 "model_number": "INTEL SSDPE2KX020T8", 00:27:59.664 "serial_number": "PHLJ9512038S2P0BGN", 00:27:59.664 "firmware_revision": "VDV10184", 00:27:59.664 "oacs": { 00:27:59.664 "security": 0, 00:27:59.664 "format": 1, 00:27:59.664 "firmware": 1, 00:27:59.664 "ns_manage": 1 00:27:59.664 }, 00:27:59.664 "multi_ctrlr": false, 00:27:59.664 "ana_reporting": false 00:27:59.664 }, 00:27:59.664 "vs": { 00:27:59.664 "nvme_version": "1.2" 00:27:59.664 }, 00:27:59.664 "ns_data": { 00:27:59.664 "id": 1, 00:27:59.664 "can_share": false 00:27:59.664 } 00:27:59.664 } 00:27:59.664 ], 00:27:59.664 "mp_policy": "active_passive" 00:27:59.664 } 00:27:59.664 } 00:27:59.664 ] 00:27:59.664 08:03:44 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:27:59.664 08:03:44 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:01.048 0e9a1be3-b5e8-4c19-a2d5-f8d9ca9e77dd 00:28:01.048 08:03:45 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:01.048 78b3d1df-4f6e-4c9f-bf30-2e07e2435f24 00:28:01.048 08:03:45 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:01.048 08:03:45 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:01.048 08:03:45 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:01.048 08:03:45 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:01.048 08:03:45 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:01.048 08:03:45 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:01.048 08:03:45 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:01.308 08:03:45 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:01.568 [ 00:28:01.568 { 00:28:01.568 "name": "78b3d1df-4f6e-4c9f-bf30-2e07e2435f24", 00:28:01.568 "aliases": [ 00:28:01.568 "lvs0/lv0" 00:28:01.568 ], 00:28:01.568 "product_name": "Logical Volume", 00:28:01.568 "block_size": 512, 00:28:01.568 "num_blocks": 204800, 00:28:01.568 "uuid": "78b3d1df-4f6e-4c9f-bf30-2e07e2435f24", 00:28:01.568 "assigned_rate_limits": { 00:28:01.568 "rw_ios_per_sec": 0, 00:28:01.568 "rw_mbytes_per_sec": 0, 00:28:01.568 "r_mbytes_per_sec": 0, 00:28:01.568 "w_mbytes_per_sec": 0 00:28:01.568 }, 00:28:01.568 "claimed": false, 00:28:01.568 "zoned": false, 00:28:01.568 "supported_io_types": { 00:28:01.568 "read": true, 00:28:01.568 "write": true, 00:28:01.568 "unmap": true, 00:28:01.568 "flush": false, 00:28:01.568 "reset": true, 00:28:01.568 "nvme_admin": false, 00:28:01.568 "nvme_io": false, 00:28:01.568 "nvme_io_md": false, 00:28:01.568 "write_zeroes": true, 00:28:01.568 "zcopy": false, 00:28:01.568 "get_zone_info": false, 00:28:01.568 "zone_management": false, 00:28:01.568 "zone_append": false, 00:28:01.568 "compare": false, 00:28:01.568 "compare_and_write": false, 
00:28:01.568 "abort": false, 00:28:01.568 "seek_hole": true, 00:28:01.568 "seek_data": true, 00:28:01.568 "copy": false, 00:28:01.568 "nvme_iov_md": false 00:28:01.568 }, 00:28:01.568 "driver_specific": { 00:28:01.568 "lvol": { 00:28:01.568 "lvol_store_uuid": "0e9a1be3-b5e8-4c19-a2d5-f8d9ca9e77dd", 00:28:01.568 "base_bdev": "Nvme0n1", 00:28:01.568 "thin_provision": true, 00:28:01.568 "num_allocated_clusters": 0, 00:28:01.568 "snapshot": false, 00:28:01.568 "clone": false, 00:28:01.568 "esnap_clone": false 00:28:01.568 } 00:28:01.568 } 00:28:01.568 } 00:28:01.568 ] 00:28:01.568 08:03:46 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:01.568 08:03:46 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:01.568 08:03:46 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:01.828 [2024-07-15 08:03:46.339963] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:01.828 COMP_lvs0/lv0 00:28:01.828 08:03:46 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:01.828 08:03:46 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:01.828 08:03:46 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:01.829 08:03:46 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:01.829 08:03:46 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:01.829 08:03:46 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:01.829 08:03:46 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:01.829 08:03:46 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:02.088 [ 00:28:02.089 { 00:28:02.089 "name": "COMP_lvs0/lv0", 00:28:02.089 "aliases": [ 00:28:02.089 "df3fcdb5-119f-57f5-91e9-3d1c74038d5d" 00:28:02.089 ], 00:28:02.089 "product_name": "compress", 00:28:02.089 "block_size": 512, 00:28:02.089 "num_blocks": 200704, 00:28:02.089 "uuid": "df3fcdb5-119f-57f5-91e9-3d1c74038d5d", 00:28:02.089 "assigned_rate_limits": { 00:28:02.089 "rw_ios_per_sec": 0, 00:28:02.089 "rw_mbytes_per_sec": 0, 00:28:02.089 "r_mbytes_per_sec": 0, 00:28:02.089 "w_mbytes_per_sec": 0 00:28:02.089 }, 00:28:02.089 "claimed": false, 00:28:02.089 "zoned": false, 00:28:02.089 "supported_io_types": { 00:28:02.089 "read": true, 00:28:02.089 "write": true, 00:28:02.089 "unmap": false, 00:28:02.089 "flush": false, 00:28:02.089 "reset": false, 00:28:02.089 "nvme_admin": false, 00:28:02.089 "nvme_io": false, 00:28:02.089 "nvme_io_md": false, 00:28:02.089 "write_zeroes": true, 00:28:02.089 "zcopy": false, 00:28:02.089 "get_zone_info": false, 00:28:02.089 "zone_management": false, 00:28:02.089 "zone_append": false, 00:28:02.089 "compare": false, 00:28:02.089 "compare_and_write": false, 00:28:02.089 "abort": false, 00:28:02.089 "seek_hole": false, 00:28:02.089 "seek_data": false, 00:28:02.089 "copy": false, 00:28:02.089 "nvme_iov_md": false 00:28:02.089 }, 00:28:02.089 "driver_specific": { 00:28:02.089 "compress": { 00:28:02.089 "name": "COMP_lvs0/lv0", 00:28:02.089 "base_bdev_name": "78b3d1df-4f6e-4c9f-bf30-2e07e2435f24" 00:28:02.089 } 00:28:02.089 } 00:28:02.089 } 00:28:02.089 ] 00:28:02.089 08:03:46 compress_isal -- 
common/autotest_common.sh@905 -- # return 0 00:28:02.089 08:03:46 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:02.349 Running I/O for 3 seconds... 00:28:05.647 00:28:05.647 Latency(us) 00:28:05.647 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:05.647 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:05.647 Verification LBA range: start 0x0 length 0x3100 00:28:05.647 COMP_lvs0/lv0 : 3.02 1091.88 4.27 0.00 0.00 29184.44 447.41 31860.58 00:28:05.647 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:05.647 Verification LBA range: start 0x3100 length 0x3100 00:28:05.647 COMP_lvs0/lv0 : 3.01 1100.23 4.30 0.00 0.00 28915.41 200.07 30650.68 00:28:05.647 =================================================================================================================== 00:28:05.647 Total : 2192.11 8.56 0.00 0.00 29049.47 200.07 31860.58 00:28:05.647 0 00:28:05.647 08:03:49 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:28:05.647 08:03:49 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:05.647 08:03:50 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:05.647 08:03:50 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:05.647 08:03:50 compress_isal -- compress/compress.sh@78 -- # killprocess 1782505 00:28:05.647 08:03:50 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1782505 ']' 00:28:05.647 08:03:50 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1782505 00:28:05.647 08:03:50 compress_isal -- common/autotest_common.sh@953 -- # uname 00:28:05.647 08:03:50 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:05.647 08:03:50 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1782505 00:28:05.907 08:03:50 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:05.907 08:03:50 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:05.907 08:03:50 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1782505' 00:28:05.907 killing process with pid 1782505 00:28:05.907 08:03:50 compress_isal -- common/autotest_common.sh@967 -- # kill 1782505 00:28:05.907 Received shutdown signal, test time was about 3.000000 seconds 00:28:05.907 00:28:05.907 Latency(us) 00:28:05.907 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:05.907 =================================================================================================================== 00:28:05.907 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:05.907 08:03:50 compress_isal -- common/autotest_common.sh@972 -- # wait 1782505 00:28:08.448 08:03:52 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:28:08.448 08:03:52 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:28:08.448 08:03:52 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1784445 00:28:08.448 08:03:52 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:08.448 08:03:52 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1784445 
00:28:08.448 08:03:52 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:28:08.448 08:03:52 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1784445 ']' 00:28:08.448 08:03:52 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:08.448 08:03:52 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:08.448 08:03:52 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:08.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:08.448 08:03:52 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:08.448 08:03:52 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:08.448 [2024-07-15 08:03:52.921555] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:28:08.448 [2024-07-15 08:03:52.921623] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1784445 ] 00:28:08.448 [2024-07-15 08:03:53.005812] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:08.448 [2024-07-15 08:03:53.107095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:08.448 [2024-07-15 08:03:53.107098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:09.387 08:03:53 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:09.387 08:03:53 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:28:09.387 08:03:53 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:28:09.387 08:03:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:09.387 08:03:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:12.714 08:03:56 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:12.714 08:03:56 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:12.714 08:03:56 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:12.714 08:03:56 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:12.714 08:03:56 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:12.714 08:03:56 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:12.714 08:03:56 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:12.714 08:03:57 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:12.714 [ 00:28:12.714 { 00:28:12.714 "name": "Nvme0n1", 00:28:12.714 "aliases": [ 00:28:12.714 "4a3bef6d-9c37-4857-8a37-a219b99b8adb" 00:28:12.714 ], 00:28:12.714 "product_name": "NVMe disk", 00:28:12.714 "block_size": 512, 00:28:12.714 "num_blocks": 3907029168, 00:28:12.714 "uuid": "4a3bef6d-9c37-4857-8a37-a219b99b8adb", 00:28:12.714 "assigned_rate_limits": { 00:28:12.714 "rw_ios_per_sec": 0, 00:28:12.714 "rw_mbytes_per_sec": 0, 00:28:12.714 "r_mbytes_per_sec": 0, 00:28:12.714 
"w_mbytes_per_sec": 0 00:28:12.714 }, 00:28:12.714 "claimed": false, 00:28:12.714 "zoned": false, 00:28:12.714 "supported_io_types": { 00:28:12.714 "read": true, 00:28:12.714 "write": true, 00:28:12.714 "unmap": true, 00:28:12.714 "flush": true, 00:28:12.714 "reset": true, 00:28:12.714 "nvme_admin": true, 00:28:12.714 "nvme_io": true, 00:28:12.714 "nvme_io_md": false, 00:28:12.714 "write_zeroes": true, 00:28:12.714 "zcopy": false, 00:28:12.714 "get_zone_info": false, 00:28:12.714 "zone_management": false, 00:28:12.714 "zone_append": false, 00:28:12.714 "compare": false, 00:28:12.715 "compare_and_write": false, 00:28:12.715 "abort": true, 00:28:12.715 "seek_hole": false, 00:28:12.715 "seek_data": false, 00:28:12.715 "copy": false, 00:28:12.715 "nvme_iov_md": false 00:28:12.715 }, 00:28:12.715 "driver_specific": { 00:28:12.715 "nvme": [ 00:28:12.715 { 00:28:12.715 "pci_address": "0000:65:00.0", 00:28:12.715 "trid": { 00:28:12.715 "trtype": "PCIe", 00:28:12.715 "traddr": "0000:65:00.0" 00:28:12.715 }, 00:28:12.715 "ctrlr_data": { 00:28:12.715 "cntlid": 0, 00:28:12.715 "vendor_id": "0x8086", 00:28:12.715 "model_number": "INTEL SSDPE2KX020T8", 00:28:12.715 "serial_number": "PHLJ9512038S2P0BGN", 00:28:12.715 "firmware_revision": "VDV10184", 00:28:12.715 "oacs": { 00:28:12.715 "security": 0, 00:28:12.715 "format": 1, 00:28:12.715 "firmware": 1, 00:28:12.715 "ns_manage": 1 00:28:12.715 }, 00:28:12.715 "multi_ctrlr": false, 00:28:12.715 "ana_reporting": false 00:28:12.715 }, 00:28:12.715 "vs": { 00:28:12.715 "nvme_version": "1.2" 00:28:12.715 }, 00:28:12.715 "ns_data": { 00:28:12.715 "id": 1, 00:28:12.715 "can_share": false 00:28:12.715 } 00:28:12.715 } 00:28:12.715 ], 00:28:12.715 "mp_policy": "active_passive" 00:28:12.715 } 00:28:12.715 } 00:28:12.715 ] 00:28:12.715 08:03:57 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:12.715 08:03:57 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:14.097 7f6003d2-8c87-46ca-a3b5-83c95bb0b1ab 00:28:14.097 08:03:58 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:14.097 cf8b074b-2f30-49c9-ba28-15bea5983de8 00:28:14.097 08:03:58 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:14.097 08:03:58 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:14.097 08:03:58 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:14.097 08:03:58 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:14.097 08:03:58 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:14.097 08:03:58 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:14.097 08:03:58 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:14.359 08:03:59 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:14.625 [ 00:28:14.625 { 00:28:14.625 "name": "cf8b074b-2f30-49c9-ba28-15bea5983de8", 00:28:14.625 "aliases": [ 00:28:14.625 "lvs0/lv0" 00:28:14.625 ], 00:28:14.625 "product_name": "Logical Volume", 00:28:14.625 "block_size": 512, 00:28:14.625 "num_blocks": 204800, 00:28:14.625 "uuid": "cf8b074b-2f30-49c9-ba28-15bea5983de8", 00:28:14.625 
"assigned_rate_limits": { 00:28:14.625 "rw_ios_per_sec": 0, 00:28:14.625 "rw_mbytes_per_sec": 0, 00:28:14.625 "r_mbytes_per_sec": 0, 00:28:14.625 "w_mbytes_per_sec": 0 00:28:14.625 }, 00:28:14.625 "claimed": false, 00:28:14.625 "zoned": false, 00:28:14.625 "supported_io_types": { 00:28:14.625 "read": true, 00:28:14.625 "write": true, 00:28:14.625 "unmap": true, 00:28:14.625 "flush": false, 00:28:14.625 "reset": true, 00:28:14.625 "nvme_admin": false, 00:28:14.625 "nvme_io": false, 00:28:14.625 "nvme_io_md": false, 00:28:14.625 "write_zeroes": true, 00:28:14.625 "zcopy": false, 00:28:14.625 "get_zone_info": false, 00:28:14.625 "zone_management": false, 00:28:14.625 "zone_append": false, 00:28:14.625 "compare": false, 00:28:14.625 "compare_and_write": false, 00:28:14.625 "abort": false, 00:28:14.625 "seek_hole": true, 00:28:14.625 "seek_data": true, 00:28:14.625 "copy": false, 00:28:14.625 "nvme_iov_md": false 00:28:14.625 }, 00:28:14.625 "driver_specific": { 00:28:14.625 "lvol": { 00:28:14.625 "lvol_store_uuid": "7f6003d2-8c87-46ca-a3b5-83c95bb0b1ab", 00:28:14.625 "base_bdev": "Nvme0n1", 00:28:14.625 "thin_provision": true, 00:28:14.625 "num_allocated_clusters": 0, 00:28:14.625 "snapshot": false, 00:28:14.625 "clone": false, 00:28:14.625 "esnap_clone": false 00:28:14.625 } 00:28:14.625 } 00:28:14.625 } 00:28:14.625 ] 00:28:14.625 08:03:59 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:14.625 08:03:59 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:28:14.625 08:03:59 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:28:14.885 [2024-07-15 08:03:59.433493] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:14.885 COMP_lvs0/lv0 00:28:14.885 08:03:59 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:14.885 08:03:59 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:14.885 08:03:59 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:14.885 08:03:59 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:14.885 08:03:59 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:14.885 08:03:59 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:14.885 08:03:59 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:15.145 08:03:59 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:15.145 [ 00:28:15.145 { 00:28:15.145 "name": "COMP_lvs0/lv0", 00:28:15.145 "aliases": [ 00:28:15.145 "2cefbd09-c7a9-59fb-be57-151c3910951c" 00:28:15.145 ], 00:28:15.145 "product_name": "compress", 00:28:15.145 "block_size": 512, 00:28:15.145 "num_blocks": 200704, 00:28:15.145 "uuid": "2cefbd09-c7a9-59fb-be57-151c3910951c", 00:28:15.145 "assigned_rate_limits": { 00:28:15.145 "rw_ios_per_sec": 0, 00:28:15.145 "rw_mbytes_per_sec": 0, 00:28:15.145 "r_mbytes_per_sec": 0, 00:28:15.145 "w_mbytes_per_sec": 0 00:28:15.145 }, 00:28:15.145 "claimed": false, 00:28:15.145 "zoned": false, 00:28:15.145 "supported_io_types": { 00:28:15.145 "read": true, 00:28:15.145 "write": true, 00:28:15.145 "unmap": false, 00:28:15.145 "flush": false, 00:28:15.145 "reset": false, 00:28:15.145 "nvme_admin": 
false, 00:28:15.146 "nvme_io": false, 00:28:15.146 "nvme_io_md": false, 00:28:15.146 "write_zeroes": true, 00:28:15.146 "zcopy": false, 00:28:15.146 "get_zone_info": false, 00:28:15.146 "zone_management": false, 00:28:15.146 "zone_append": false, 00:28:15.146 "compare": false, 00:28:15.146 "compare_and_write": false, 00:28:15.146 "abort": false, 00:28:15.146 "seek_hole": false, 00:28:15.146 "seek_data": false, 00:28:15.146 "copy": false, 00:28:15.146 "nvme_iov_md": false 00:28:15.146 }, 00:28:15.146 "driver_specific": { 00:28:15.146 "compress": { 00:28:15.146 "name": "COMP_lvs0/lv0", 00:28:15.146 "base_bdev_name": "cf8b074b-2f30-49c9-ba28-15bea5983de8" 00:28:15.146 } 00:28:15.146 } 00:28:15.146 } 00:28:15.146 ] 00:28:15.406 08:03:59 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:15.406 08:03:59 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:15.406 Running I/O for 3 seconds... 00:28:18.721 00:28:18.721 Latency(us) 00:28:18.721 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:18.721 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:18.721 Verification LBA range: start 0x0 length 0x3100 00:28:18.721 COMP_lvs0/lv0 : 3.02 1091.08 4.26 0.00 0.00 29226.82 184.32 33070.47 00:28:18.721 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:18.721 Verification LBA range: start 0x3100 length 0x3100 00:28:18.721 COMP_lvs0/lv0 : 3.02 1104.05 4.31 0.00 0.00 28800.16 294.60 33877.07 00:28:18.721 =================================================================================================================== 00:28:18.721 Total : 2195.12 8.57 0.00 0.00 29012.17 184.32 33877.07 00:28:18.721 0 00:28:18.721 08:04:03 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:28:18.721 08:04:03 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:18.721 08:04:03 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:18.981 08:04:03 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:18.981 08:04:03 compress_isal -- compress/compress.sh@78 -- # killprocess 1784445 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1784445 ']' 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1784445 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@953 -- # uname 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1784445 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1784445' 00:28:18.981 killing process with pid 1784445 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@967 -- # kill 1784445 00:28:18.981 Received shutdown signal, test time was about 3.000000 seconds 00:28:18.981 00:28:18.981 Latency(us) 00:28:18.981 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:18.981 
=================================================================================================================== 00:28:18.981 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:18.981 08:04:03 compress_isal -- common/autotest_common.sh@972 -- # wait 1784445 00:28:21.532 08:04:05 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:28:21.532 08:04:05 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:28:21.532 08:04:05 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1786532 00:28:21.532 08:04:05 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:21.532 08:04:05 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1786532 00:28:21.532 08:04:05 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:28:21.532 08:04:05 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1786532 ']' 00:28:21.532 08:04:05 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:21.532 08:04:05 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:21.532 08:04:05 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:21.532 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:21.532 08:04:05 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:21.532 08:04:05 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:21.532 [2024-07-15 08:04:06.021488] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:28:21.532 [2024-07-15 08:04:06.021553] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1786532 ] 00:28:21.532 [2024-07-15 08:04:06.106302] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:21.532 [2024-07-15 08:04:06.208044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:21.532 [2024-07-15 08:04:06.208170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:22.471 08:04:06 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:22.471 08:04:06 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:28:22.471 08:04:06 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:28:22.471 08:04:06 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:22.471 08:04:06 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:25.772 08:04:09 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:25.772 08:04:09 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:25.772 08:04:09 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:25.772 08:04:09 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:25.772 08:04:09 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:25.772 08:04:09 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:25.772 08:04:09 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:25.772 08:04:10 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:25.772 [ 00:28:25.772 { 00:28:25.772 "name": "Nvme0n1", 00:28:25.772 "aliases": [ 00:28:25.772 "d7360ee9-5ff9-434f-9ca9-12fb60f1ae1e" 00:28:25.772 ], 00:28:25.772 "product_name": "NVMe disk", 00:28:25.772 "block_size": 512, 00:28:25.772 "num_blocks": 3907029168, 00:28:25.772 "uuid": "d7360ee9-5ff9-434f-9ca9-12fb60f1ae1e", 00:28:25.772 "assigned_rate_limits": { 00:28:25.772 "rw_ios_per_sec": 0, 00:28:25.772 "rw_mbytes_per_sec": 0, 00:28:25.772 "r_mbytes_per_sec": 0, 00:28:25.772 "w_mbytes_per_sec": 0 00:28:25.772 }, 00:28:25.772 "claimed": false, 00:28:25.772 "zoned": false, 00:28:25.772 "supported_io_types": { 00:28:25.772 "read": true, 00:28:25.772 "write": true, 00:28:25.772 "unmap": true, 00:28:25.772 "flush": true, 00:28:25.772 "reset": true, 00:28:25.772 "nvme_admin": true, 00:28:25.772 "nvme_io": true, 00:28:25.772 "nvme_io_md": false, 00:28:25.772 "write_zeroes": true, 00:28:25.772 "zcopy": false, 00:28:25.772 "get_zone_info": false, 00:28:25.772 "zone_management": false, 00:28:25.772 "zone_append": false, 00:28:25.772 "compare": false, 00:28:25.772 "compare_and_write": false, 00:28:25.772 "abort": true, 00:28:25.772 "seek_hole": false, 00:28:25.772 "seek_data": false, 00:28:25.772 "copy": false, 00:28:25.772 "nvme_iov_md": false 00:28:25.772 }, 00:28:25.772 "driver_specific": { 00:28:25.772 "nvme": [ 00:28:25.772 { 00:28:25.772 "pci_address": "0000:65:00.0", 00:28:25.772 "trid": { 00:28:25.772 "trtype": "PCIe", 00:28:25.772 "traddr": "0000:65:00.0" 00:28:25.772 }, 00:28:25.772 
"ctrlr_data": { 00:28:25.772 "cntlid": 0, 00:28:25.772 "vendor_id": "0x8086", 00:28:25.772 "model_number": "INTEL SSDPE2KX020T8", 00:28:25.772 "serial_number": "PHLJ9512038S2P0BGN", 00:28:25.772 "firmware_revision": "VDV10184", 00:28:25.772 "oacs": { 00:28:25.772 "security": 0, 00:28:25.772 "format": 1, 00:28:25.772 "firmware": 1, 00:28:25.772 "ns_manage": 1 00:28:25.772 }, 00:28:25.772 "multi_ctrlr": false, 00:28:25.772 "ana_reporting": false 00:28:25.772 }, 00:28:25.772 "vs": { 00:28:25.772 "nvme_version": "1.2" 00:28:25.772 }, 00:28:25.772 "ns_data": { 00:28:25.772 "id": 1, 00:28:25.772 "can_share": false 00:28:25.772 } 00:28:25.772 } 00:28:25.772 ], 00:28:25.772 "mp_policy": "active_passive" 00:28:25.772 } 00:28:25.772 } 00:28:25.772 ] 00:28:25.772 08:04:10 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:25.772 08:04:10 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:27.158 14e1484b-0f0d-4c39-b4e4-b1906e8e72d8 00:28:27.158 08:04:11 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:27.158 c5a35b93-4fa2-4f20-baed-1587d4c4abc3 00:28:27.420 08:04:11 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:27.420 08:04:11 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:27.420 08:04:11 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:27.420 08:04:11 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:27.420 08:04:11 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:27.420 08:04:11 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:27.420 08:04:11 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:27.420 08:04:12 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:27.681 [ 00:28:27.681 { 00:28:27.681 "name": "c5a35b93-4fa2-4f20-baed-1587d4c4abc3", 00:28:27.681 "aliases": [ 00:28:27.681 "lvs0/lv0" 00:28:27.681 ], 00:28:27.681 "product_name": "Logical Volume", 00:28:27.681 "block_size": 512, 00:28:27.681 "num_blocks": 204800, 00:28:27.681 "uuid": "c5a35b93-4fa2-4f20-baed-1587d4c4abc3", 00:28:27.681 "assigned_rate_limits": { 00:28:27.681 "rw_ios_per_sec": 0, 00:28:27.681 "rw_mbytes_per_sec": 0, 00:28:27.681 "r_mbytes_per_sec": 0, 00:28:27.681 "w_mbytes_per_sec": 0 00:28:27.681 }, 00:28:27.681 "claimed": false, 00:28:27.681 "zoned": false, 00:28:27.681 "supported_io_types": { 00:28:27.681 "read": true, 00:28:27.681 "write": true, 00:28:27.681 "unmap": true, 00:28:27.681 "flush": false, 00:28:27.681 "reset": true, 00:28:27.681 "nvme_admin": false, 00:28:27.681 "nvme_io": false, 00:28:27.681 "nvme_io_md": false, 00:28:27.681 "write_zeroes": true, 00:28:27.681 "zcopy": false, 00:28:27.681 "get_zone_info": false, 00:28:27.681 "zone_management": false, 00:28:27.681 "zone_append": false, 00:28:27.681 "compare": false, 00:28:27.681 "compare_and_write": false, 00:28:27.681 "abort": false, 00:28:27.682 "seek_hole": true, 00:28:27.682 "seek_data": true, 00:28:27.682 "copy": false, 00:28:27.682 "nvme_iov_md": false 00:28:27.682 }, 00:28:27.682 "driver_specific": { 00:28:27.682 "lvol": { 00:28:27.682 "lvol_store_uuid": 
"14e1484b-0f0d-4c39-b4e4-b1906e8e72d8", 00:28:27.682 "base_bdev": "Nvme0n1", 00:28:27.682 "thin_provision": true, 00:28:27.682 "num_allocated_clusters": 0, 00:28:27.682 "snapshot": false, 00:28:27.682 "clone": false, 00:28:27.682 "esnap_clone": false 00:28:27.682 } 00:28:27.682 } 00:28:27.682 } 00:28:27.682 ] 00:28:27.682 08:04:12 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:27.682 08:04:12 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:28:27.682 08:04:12 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:28:27.943 [2024-07-15 08:04:12.577202] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:27.943 COMP_lvs0/lv0 00:28:27.943 08:04:12 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:27.943 08:04:12 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:27.943 08:04:12 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:27.943 08:04:12 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:27.943 08:04:12 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:27.943 08:04:12 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:27.943 08:04:12 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:28.204 08:04:12 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:28.465 [ 00:28:28.465 { 00:28:28.465 "name": "COMP_lvs0/lv0", 00:28:28.465 "aliases": [ 00:28:28.465 "7c601df8-a770-5a0e-accd-6a9ea0c4003b" 00:28:28.465 ], 00:28:28.465 "product_name": "compress", 00:28:28.465 "block_size": 4096, 00:28:28.465 "num_blocks": 25088, 00:28:28.465 "uuid": "7c601df8-a770-5a0e-accd-6a9ea0c4003b", 00:28:28.465 "assigned_rate_limits": { 00:28:28.465 "rw_ios_per_sec": 0, 00:28:28.465 "rw_mbytes_per_sec": 0, 00:28:28.465 "r_mbytes_per_sec": 0, 00:28:28.465 "w_mbytes_per_sec": 0 00:28:28.465 }, 00:28:28.465 "claimed": false, 00:28:28.465 "zoned": false, 00:28:28.465 "supported_io_types": { 00:28:28.465 "read": true, 00:28:28.465 "write": true, 00:28:28.465 "unmap": false, 00:28:28.465 "flush": false, 00:28:28.465 "reset": false, 00:28:28.465 "nvme_admin": false, 00:28:28.465 "nvme_io": false, 00:28:28.465 "nvme_io_md": false, 00:28:28.465 "write_zeroes": true, 00:28:28.465 "zcopy": false, 00:28:28.465 "get_zone_info": false, 00:28:28.465 "zone_management": false, 00:28:28.465 "zone_append": false, 00:28:28.465 "compare": false, 00:28:28.465 "compare_and_write": false, 00:28:28.465 "abort": false, 00:28:28.465 "seek_hole": false, 00:28:28.465 "seek_data": false, 00:28:28.465 "copy": false, 00:28:28.465 "nvme_iov_md": false 00:28:28.465 }, 00:28:28.465 "driver_specific": { 00:28:28.465 "compress": { 00:28:28.465 "name": "COMP_lvs0/lv0", 00:28:28.465 "base_bdev_name": "c5a35b93-4fa2-4f20-baed-1587d4c4abc3" 00:28:28.465 } 00:28:28.465 } 00:28:28.465 } 00:28:28.465 ] 00:28:28.465 08:04:13 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:28.465 08:04:13 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:28.465 Running I/O for 3 seconds... 
00:28:31.762 00:28:31.762 Latency(us) 00:28:31.762 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:31.762 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:31.762 Verification LBA range: start 0x0 length 0x3100 00:28:31.762 COMP_lvs0/lv0 : 3.02 1133.91 4.43 0.00 0.00 28086.38 526.18 29037.49 00:28:31.763 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:31.763 Verification LBA range: start 0x3100 length 0x3100 00:28:31.763 COMP_lvs0/lv0 : 3.02 1133.85 4.43 0.00 0.00 28032.19 274.12 29037.49 00:28:31.763 =================================================================================================================== 00:28:31.763 Total : 2267.76 8.86 0.00 0.00 28059.28 274.12 29037.49 00:28:31.763 0 00:28:31.763 08:04:16 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:28:31.763 08:04:16 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:31.763 08:04:16 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:32.024 08:04:16 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:32.024 08:04:16 compress_isal -- compress/compress.sh@78 -- # killprocess 1786532 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1786532 ']' 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1786532 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@953 -- # uname 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1786532 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1786532' 00:28:32.024 killing process with pid 1786532 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@967 -- # kill 1786532 00:28:32.024 Received shutdown signal, test time was about 3.000000 seconds 00:28:32.024 00:28:32.024 Latency(us) 00:28:32.024 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:32.024 =================================================================================================================== 00:28:32.024 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:32.024 08:04:16 compress_isal -- common/autotest_common.sh@972 -- # wait 1786532 00:28:34.635 08:04:19 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:28:34.635 08:04:19 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:28:34.635 08:04:19 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1788689 00:28:34.635 08:04:19 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:34.635 08:04:19 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1788689 00:28:34.635 08:04:19 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:28:34.635 08:04:19 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1788689 ']' 00:28:34.635 08:04:19 compress_isal -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:34.635 08:04:19 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:34.636 08:04:19 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:34.636 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:34.636 08:04:19 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:34.636 08:04:19 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:34.636 [2024-07-15 08:04:19.204469] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:28:34.636 [2024-07-15 08:04:19.204541] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1788689 ] 00:28:34.636 [2024-07-15 08:04:19.299009] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:34.896 [2024-07-15 08:04:19.395262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:34.896 [2024-07-15 08:04:19.395418] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:34.896 [2024-07-15 08:04:19.395418] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:35.467 08:04:20 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:35.467 08:04:20 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:28:35.467 08:04:20 compress_isal -- compress/compress.sh@58 -- # create_vols 00:28:35.467 08:04:20 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:35.467 08:04:20 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:38.765 08:04:23 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:38.765 08:04:23 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:39.024 [ 00:28:39.024 { 00:28:39.024 "name": "Nvme0n1", 00:28:39.024 "aliases": [ 00:28:39.024 "1009d577-0111-4eba-b315-1fc8d85c15c0" 00:28:39.024 ], 00:28:39.024 "product_name": "NVMe disk", 00:28:39.024 "block_size": 512, 00:28:39.024 "num_blocks": 3907029168, 00:28:39.024 "uuid": "1009d577-0111-4eba-b315-1fc8d85c15c0", 00:28:39.024 "assigned_rate_limits": { 00:28:39.024 "rw_ios_per_sec": 0, 00:28:39.024 "rw_mbytes_per_sec": 0, 00:28:39.024 "r_mbytes_per_sec": 0, 00:28:39.024 "w_mbytes_per_sec": 0 00:28:39.024 }, 00:28:39.024 "claimed": false, 00:28:39.024 "zoned": false, 00:28:39.024 "supported_io_types": { 00:28:39.024 "read": true, 00:28:39.024 "write": true, 00:28:39.024 "unmap": 
true, 00:28:39.024 "flush": true, 00:28:39.024 "reset": true, 00:28:39.024 "nvme_admin": true, 00:28:39.024 "nvme_io": true, 00:28:39.024 "nvme_io_md": false, 00:28:39.024 "write_zeroes": true, 00:28:39.024 "zcopy": false, 00:28:39.024 "get_zone_info": false, 00:28:39.024 "zone_management": false, 00:28:39.024 "zone_append": false, 00:28:39.024 "compare": false, 00:28:39.024 "compare_and_write": false, 00:28:39.024 "abort": true, 00:28:39.024 "seek_hole": false, 00:28:39.024 "seek_data": false, 00:28:39.024 "copy": false, 00:28:39.024 "nvme_iov_md": false 00:28:39.024 }, 00:28:39.024 "driver_specific": { 00:28:39.024 "nvme": [ 00:28:39.024 { 00:28:39.024 "pci_address": "0000:65:00.0", 00:28:39.024 "trid": { 00:28:39.024 "trtype": "PCIe", 00:28:39.024 "traddr": "0000:65:00.0" 00:28:39.024 }, 00:28:39.024 "ctrlr_data": { 00:28:39.024 "cntlid": 0, 00:28:39.024 "vendor_id": "0x8086", 00:28:39.024 "model_number": "INTEL SSDPE2KX020T8", 00:28:39.024 "serial_number": "PHLJ9512038S2P0BGN", 00:28:39.024 "firmware_revision": "VDV10184", 00:28:39.024 "oacs": { 00:28:39.024 "security": 0, 00:28:39.024 "format": 1, 00:28:39.024 "firmware": 1, 00:28:39.024 "ns_manage": 1 00:28:39.024 }, 00:28:39.024 "multi_ctrlr": false, 00:28:39.024 "ana_reporting": false 00:28:39.024 }, 00:28:39.024 "vs": { 00:28:39.024 "nvme_version": "1.2" 00:28:39.024 }, 00:28:39.024 "ns_data": { 00:28:39.024 "id": 1, 00:28:39.024 "can_share": false 00:28:39.024 } 00:28:39.024 } 00:28:39.024 ], 00:28:39.024 "mp_policy": "active_passive" 00:28:39.024 } 00:28:39.024 } 00:28:39.024 ] 00:28:39.025 08:04:23 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:39.025 08:04:23 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:40.406 bc25b133-1d95-4e95-a4c7-8a136344709e 00:28:40.406 08:04:24 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:40.406 fddb6c00-e150-459d-b29a-372d5598372a 00:28:40.406 08:04:25 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:40.406 08:04:25 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:28:40.406 08:04:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:40.406 08:04:25 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:40.406 08:04:25 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:40.406 08:04:25 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:40.406 08:04:25 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:40.665 08:04:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:40.924 [ 00:28:40.924 { 00:28:40.924 "name": "fddb6c00-e150-459d-b29a-372d5598372a", 00:28:40.924 "aliases": [ 00:28:40.924 "lvs0/lv0" 00:28:40.924 ], 00:28:40.924 "product_name": "Logical Volume", 00:28:40.924 "block_size": 512, 00:28:40.924 "num_blocks": 204800, 00:28:40.924 "uuid": "fddb6c00-e150-459d-b29a-372d5598372a", 00:28:40.924 "assigned_rate_limits": { 00:28:40.924 "rw_ios_per_sec": 0, 00:28:40.924 "rw_mbytes_per_sec": 0, 00:28:40.924 "r_mbytes_per_sec": 0, 00:28:40.924 "w_mbytes_per_sec": 0 00:28:40.924 }, 00:28:40.924 "claimed": false, 00:28:40.924 
"zoned": false, 00:28:40.924 "supported_io_types": { 00:28:40.924 "read": true, 00:28:40.924 "write": true, 00:28:40.924 "unmap": true, 00:28:40.924 "flush": false, 00:28:40.924 "reset": true, 00:28:40.924 "nvme_admin": false, 00:28:40.924 "nvme_io": false, 00:28:40.924 "nvme_io_md": false, 00:28:40.924 "write_zeroes": true, 00:28:40.924 "zcopy": false, 00:28:40.924 "get_zone_info": false, 00:28:40.924 "zone_management": false, 00:28:40.924 "zone_append": false, 00:28:40.924 "compare": false, 00:28:40.924 "compare_and_write": false, 00:28:40.924 "abort": false, 00:28:40.924 "seek_hole": true, 00:28:40.924 "seek_data": true, 00:28:40.924 "copy": false, 00:28:40.924 "nvme_iov_md": false 00:28:40.924 }, 00:28:40.924 "driver_specific": { 00:28:40.924 "lvol": { 00:28:40.924 "lvol_store_uuid": "bc25b133-1d95-4e95-a4c7-8a136344709e", 00:28:40.924 "base_bdev": "Nvme0n1", 00:28:40.924 "thin_provision": true, 00:28:40.924 "num_allocated_clusters": 0, 00:28:40.924 "snapshot": false, 00:28:40.924 "clone": false, 00:28:40.924 "esnap_clone": false 00:28:40.924 } 00:28:40.924 } 00:28:40.924 } 00:28:40.924 ] 00:28:40.924 08:04:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:40.924 08:04:25 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:40.924 08:04:25 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:41.184 [2024-07-15 08:04:25.693592] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:41.184 COMP_lvs0/lv0 00:28:41.184 08:04:25 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@899 -- # local i 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:41.184 08:04:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:41.446 [ 00:28:41.446 { 00:28:41.446 "name": "COMP_lvs0/lv0", 00:28:41.446 "aliases": [ 00:28:41.446 "109b5441-35d1-5f48-b3c1-c98c1ffcd964" 00:28:41.446 ], 00:28:41.446 "product_name": "compress", 00:28:41.446 "block_size": 512, 00:28:41.446 "num_blocks": 200704, 00:28:41.446 "uuid": "109b5441-35d1-5f48-b3c1-c98c1ffcd964", 00:28:41.446 "assigned_rate_limits": { 00:28:41.446 "rw_ios_per_sec": 0, 00:28:41.446 "rw_mbytes_per_sec": 0, 00:28:41.446 "r_mbytes_per_sec": 0, 00:28:41.446 "w_mbytes_per_sec": 0 00:28:41.446 }, 00:28:41.446 "claimed": false, 00:28:41.446 "zoned": false, 00:28:41.446 "supported_io_types": { 00:28:41.446 "read": true, 00:28:41.446 "write": true, 00:28:41.446 "unmap": false, 00:28:41.446 "flush": false, 00:28:41.446 "reset": false, 00:28:41.446 "nvme_admin": false, 00:28:41.446 "nvme_io": false, 00:28:41.446 "nvme_io_md": false, 00:28:41.446 "write_zeroes": true, 00:28:41.446 "zcopy": false, 00:28:41.446 "get_zone_info": false, 00:28:41.446 "zone_management": false, 00:28:41.446 "zone_append": 
false, 00:28:41.446 "compare": false, 00:28:41.446 "compare_and_write": false, 00:28:41.446 "abort": false, 00:28:41.446 "seek_hole": false, 00:28:41.446 "seek_data": false, 00:28:41.446 "copy": false, 00:28:41.446 "nvme_iov_md": false 00:28:41.446 }, 00:28:41.446 "driver_specific": { 00:28:41.446 "compress": { 00:28:41.446 "name": "COMP_lvs0/lv0", 00:28:41.446 "base_bdev_name": "fddb6c00-e150-459d-b29a-372d5598372a" 00:28:41.446 } 00:28:41.446 } 00:28:41.446 } 00:28:41.446 ] 00:28:41.446 08:04:26 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:28:41.446 08:04:26 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:41.708 I/O targets: 00:28:41.708 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:28:41.708 00:28:41.708 00:28:41.708 CUnit - A unit testing framework for C - Version 2.1-3 00:28:41.708 http://cunit.sourceforge.net/ 00:28:41.708 00:28:41.708 00:28:41.708 Suite: bdevio tests on: COMP_lvs0/lv0 00:28:41.708 Test: blockdev write read block ...passed 00:28:41.708 Test: blockdev write zeroes read block ...passed 00:28:41.708 Test: blockdev write zeroes read no split ...passed 00:28:41.708 Test: blockdev write zeroes read split ...passed 00:28:41.708 Test: blockdev write zeroes read split partial ...passed 00:28:41.708 Test: blockdev reset ...[2024-07-15 08:04:26.411838] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:28:41.708 passed 00:28:41.708 Test: blockdev write read 8 blocks ...passed 00:28:41.708 Test: blockdev write read size > 128k ...passed 00:28:41.708 Test: blockdev write read invalid size ...passed 00:28:41.708 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:41.708 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:41.708 Test: blockdev write read max offset ...passed 00:28:41.708 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:41.708 Test: blockdev writev readv 8 blocks ...passed 00:28:41.708 Test: blockdev writev readv 30 x 1block ...passed 00:28:41.708 Test: blockdev writev readv block ...passed 00:28:41.708 Test: blockdev writev readv size > 128k ...passed 00:28:41.708 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:41.708 Test: blockdev comparev and writev ...passed 00:28:41.708 Test: blockdev nvme passthru rw ...passed 00:28:41.708 Test: blockdev nvme passthru vendor specific ...passed 00:28:41.708 Test: blockdev nvme admin passthru ...passed 00:28:41.708 Test: blockdev copy ...passed 00:28:41.708 00:28:41.708 Run Summary: Type Total Ran Passed Failed Inactive 00:28:41.708 suites 1 1 n/a 0 0 00:28:41.708 tests 23 23 23 0 0 00:28:41.708 asserts 130 130 130 0 n/a 00:28:41.708 00:28:41.708 Elapsed time = 0.423 seconds 00:28:41.708 0 00:28:41.968 08:04:26 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:28:41.968 08:04:26 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:41.968 08:04:26 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:42.228 08:04:26 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:28:42.228 08:04:26 compress_isal -- compress/compress.sh@62 -- # killprocess 1788689 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1788689 ']' 00:28:42.228 
08:04:26 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1788689 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@953 -- # uname 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1788689 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1788689' 00:28:42.228 killing process with pid 1788689 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@967 -- # kill 1788689 00:28:42.228 08:04:26 compress_isal -- common/autotest_common.sh@972 -- # wait 1788689 00:28:44.775 08:04:29 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:28:44.775 08:04:29 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:28:44.775 00:28:44.775 real 0m49.571s 00:28:44.775 user 1m53.219s 00:28:44.775 sys 0m3.390s 00:28:44.775 08:04:29 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:44.775 08:04:29 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:28:44.775 ************************************ 00:28:44.775 END TEST compress_isal 00:28:44.775 ************************************ 00:28:44.775 08:04:29 -- common/autotest_common.sh@1142 -- # return 0 00:28:44.775 08:04:29 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:28:44.775 08:04:29 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:28:44.775 08:04:29 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:28:44.775 08:04:29 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:28:44.775 08:04:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:44.775 08:04:29 -- common/autotest_common.sh@10 -- # set +x 00:28:44.775 ************************************ 00:28:44.775 START TEST blockdev_crypto_aesni 00:28:44.775 ************************************ 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:28:44.775 * Looking for test storage... 
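The compress_isal teardown logged just above is the mirror image of the setup. A rough sketch, reusing the same $rpc shorthand and assuming $app_pid holds the bdevperf/bdevio pid (the variable name is illustrative):

# Drop the compress vbdev before the lvstore underneath it
$rpc bdev_compress_delete COMP_lvs0/lv0
$rpc bdev_lvol_delete_lvstore -l lvs0

# Stop the test application, then clean up the compress metadata
kill "$app_pid" && wait "$app_pid" || true
rm -rf /tmp/pmem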
00:28:44.775 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1790394 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1790394 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 1790394 ']' 00:28:44.775 08:04:29 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk.sock...' 00:28:44.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:44.775 08:04:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:45.036 [2024-07-15 08:04:29.573007] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:28:45.036 [2024-07-15 08:04:29.573073] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1790394 ] 00:28:45.036 [2024-07-15 08:04:29.662785] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.036 [2024-07-15 08:04:29.730760] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:45.977 08:04:30 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:45.977 08:04:30 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:28:45.977 08:04:30 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:45.977 08:04:30 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:28:45.977 08:04:30 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:28:45.977 08:04:30 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:45.977 08:04:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:45.977 [2024-07-15 08:04:30.412701] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:45.977 [2024-07-15 08:04:30.420733] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:45.977 [2024-07-15 08:04:30.428751] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:45.977 [2024-07-15 08:04:30.478211] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:48.518 true 00:28:48.518 true 00:28:48.518 true 00:28:48.518 true 00:28:48.518 Malloc0 00:28:48.518 Malloc1 00:28:48.518 Malloc2 00:28:48.518 Malloc3 00:28:48.518 [2024-07-15 08:04:32.751826] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:48.518 crypto_ram 00:28:48.518 [2024-07-15 08:04:32.759847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:48.518 crypto_ram2 00:28:48.518 [2024-07-15 08:04:32.767866] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:48.518 crypto_ram3 00:28:48.518 [2024-07-15 08:04:32.775887] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:48.518 crypto_ram4 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:48.518 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:48.518 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:28:48.518 08:04:32 blockdev_crypto_aesni 
-- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.518 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:48.518 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "994519de-4b7f-5b3e-b859-7d4c544f67ca"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "994519de-4b7f-5b3e-b859-7d4c544f67ca",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "72404fa0-335d-5dd6-b7d3-25f1477f4fc7"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "72404fa0-335d-5dd6-b7d3-25f1477f4fc7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "4aa0e128-94d3-5696-ab23-7bb37f3c441c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4aa0e128-94d3-5696-ab23-7bb37f3c441c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "4b776d03-5479-5748-8b95-d6e8c3b6f5fe"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4b776d03-5479-5748-8b95-d6e8c3b6f5fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:28:48.519 08:04:32 
blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:28:48.519 08:04:32 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 1790394 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 1790394 ']' 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 1790394 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:48.519 08:04:32 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1790394 00:28:48.519 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:48.519 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:48.519 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1790394' 00:28:48.519 killing process with pid 1790394 00:28:48.519 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 1790394 00:28:48.519 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 1790394 00:28:48.780 08:04:33 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:48.780 08:04:33 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:48.780 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:48.780 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:48.780 08:04:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:48.780 ************************************ 00:28:48.780 START TEST bdev_hello_world 00:28:48.780 ************************************ 00:28:48.780 08:04:33 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:48.780 [2024-07-15 08:04:33.429746] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
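The setup_crypto_aesni_conf notices earlier in this test (driver crypto_aesni_mb selected, encrypt/decrypt assigned to dpdk_cryptodev, four malloc bdevs wrapped as crypto_ram..crypto_ram4) correspond roughly to the RPC sequence below. This is a sketch rather than the verbatim blockdev.sh helper: the key material, the malloc sizes (derived from the 32 MiB geometries with 512 B and 4 KiB blocks reported above), and some flag spellings are assumptions.

# Route crypto work to the DPDK cryptodev accel module using the AESNI-MB PMD
$rpc dpdk_cryptodev_scan_accel_module
$rpc dpdk_cryptodev_set_driver -d crypto_aesni_mb
$rpc accel_assign_opc -o encrypt -m dpdk_cryptodev
$rpc accel_assign_opc -o decrypt -m dpdk_cryptodev
$rpc framework_start_init              # the target was started with --wait-for-rpc

# Base malloc bdevs matching the reported crypto bdev geometries
$rpc bdev_malloc_create -b Malloc0 32 512
$rpc bdev_malloc_create -b Malloc1 32 512
$rpc bdev_malloc_create -b Malloc2 32 4096
$rpc bdev_malloc_create -b Malloc3 32 4096

# One AES_CBC key and one crypto vbdev per malloc bdev (key hex is illustrative)
for i in 1 2 3 4; do
  $rpc accel_crypto_key_create -c AES_CBC -k 0123456789abcdef0123456789abcdef -n "test_dek_aesni_cbc_$i"
done
$rpc bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram
$rpc bdev_crypto_create -n test_dek_aesni_cbc_2 Malloc1 crypto_ram2
$rpc bdev_crypto_create -n test_dek_aesni_cbc_3 Malloc2 crypto_ram3
$rpc bdev_crypto_create -n test_dek_aesni_cbc_4 Malloc3 crypto_ram4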
00:28:48.780 [2024-07-15 08:04:33.429790] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1791084 ] 00:28:48.780 [2024-07-15 08:04:33.517233] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:49.040 [2024-07-15 08:04:33.582241] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:49.040 [2024-07-15 08:04:33.603246] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:49.040 [2024-07-15 08:04:33.611268] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:49.040 [2024-07-15 08:04:33.619287] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:49.040 [2024-07-15 08:04:33.703198] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:51.585 [2024-07-15 08:04:35.866210] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:51.585 [2024-07-15 08:04:35.866259] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:51.585 [2024-07-15 08:04:35.866267] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.585 [2024-07-15 08:04:35.874228] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:51.585 [2024-07-15 08:04:35.874239] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:51.585 [2024-07-15 08:04:35.874244] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.585 [2024-07-15 08:04:35.882248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:51.585 [2024-07-15 08:04:35.882258] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:51.585 [2024-07-15 08:04:35.882264] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.585 [2024-07-15 08:04:35.890268] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:51.585 [2024-07-15 08:04:35.890278] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:51.585 [2024-07-15 08:04:35.890283] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:51.585 [2024-07-15 08:04:35.951525] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:28:51.585 [2024-07-15 08:04:35.951555] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:28:51.585 [2024-07-15 08:04:35.951566] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:28:51.585 [2024-07-15 08:04:35.952619] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:28:51.585 [2024-07-15 08:04:35.952672] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:28:51.585 [2024-07-15 08:04:35.952681] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:28:51.585 [2024-07-15 08:04:35.952718] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
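The hello-world step that just completed can be reproduced by hand against the same generated configuration. A small sketch, with paths relative to the SPDK checkout and bdev.json being the file blockdev.sh wrote for the crypto_aesni bdevs:

./build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram
# Expected notices, as in the log: open crypto_ram, write "Hello World!",
# read it back, then stop the application.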
00:28:51.585 00:28:51.585 [2024-07-15 08:04:35.952729] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:28:51.585 00:28:51.585 real 0m2.806s 00:28:51.585 user 0m2.541s 00:28:51.585 sys 0m0.234s 00:28:51.585 08:04:36 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:51.585 08:04:36 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:28:51.585 ************************************ 00:28:51.585 END TEST bdev_hello_world 00:28:51.585 ************************************ 00:28:51.585 08:04:36 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:28:51.585 08:04:36 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:28:51.585 08:04:36 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:28:51.585 08:04:36 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:51.585 08:04:36 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:51.585 ************************************ 00:28:51.585 START TEST bdev_bounds 00:28:51.585 ************************************ 00:28:51.585 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:28:51.585 08:04:36 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1791549 00:28:51.585 08:04:36 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:28:51.585 08:04:36 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1791549' 00:28:51.585 Process bdevio pid: 1791549 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1791549 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1791549 ']' 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:51.586 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:51.586 08:04:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:51.586 [2024-07-15 08:04:36.314220] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
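The bdev_bounds test starting here pairs the bdevio server with its companion driver script. A sketch of that pairing, assuming the same checkout layout and substituting a plain wait on the RPC socket for the waitforlisten helper blockdev.sh actually uses:

./test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json &
bdevio_pid=$!
# ... wait until /var/tmp/spdk.sock is accepting RPCs ...
./test/bdev/bdevio/tests.py perform_tests
kill "$bdevio_pid"; wait "$bdevio_pid" || true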
00:28:51.586 [2024-07-15 08:04:36.314264] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1791549 ] 00:28:51.846 [2024-07-15 08:04:36.400858] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:51.846 [2024-07-15 08:04:36.470730] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:51.846 [2024-07-15 08:04:36.471066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:51.846 [2024-07-15 08:04:36.471066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:51.846 [2024-07-15 08:04:36.492071] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:51.846 [2024-07-15 08:04:36.500099] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:51.846 [2024-07-15 08:04:36.508117] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:51.846 [2024-07-15 08:04:36.592374] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:54.409 [2024-07-15 08:04:38.762184] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:54.409 [2024-07-15 08:04:38.762246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:54.409 [2024-07-15 08:04:38.762256] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:54.409 [2024-07-15 08:04:38.770199] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:54.409 [2024-07-15 08:04:38.770211] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:54.409 [2024-07-15 08:04:38.770222] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:54.409 [2024-07-15 08:04:38.778218] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:54.409 [2024-07-15 08:04:38.778229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:54.409 [2024-07-15 08:04:38.778234] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:54.409 [2024-07-15 08:04:38.786239] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:54.409 [2024-07-15 08:04:38.786249] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:54.409 [2024-07-15 08:04:38.786255] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:54.409 08:04:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:54.409 08:04:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:28:54.409 08:04:38 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:54.409 I/O targets: 00:28:54.409 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:28:54.409 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:28:54.409 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:28:54.409 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:28:54.409 00:28:54.409 00:28:54.409 CUnit - A unit testing framework for C - Version 2.1-3 00:28:54.409 http://cunit.sourceforge.net/ 00:28:54.409 00:28:54.409 00:28:54.409 Suite: bdevio tests on: crypto_ram4 00:28:54.409 Test: blockdev write read block ...passed 00:28:54.409 Test: blockdev write zeroes read block ...passed 00:28:54.409 Test: blockdev write zeroes read no split ...passed 00:28:54.409 Test: blockdev write zeroes read split ...passed 00:28:54.409 Test: blockdev write zeroes read split partial ...passed 00:28:54.409 Test: blockdev reset ...passed 00:28:54.409 Test: blockdev write read 8 blocks ...passed 00:28:54.409 Test: blockdev write read size > 128k ...passed 00:28:54.409 Test: blockdev write read invalid size ...passed 00:28:54.409 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:54.409 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:54.409 Test: blockdev write read max offset ...passed 00:28:54.409 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:54.409 Test: blockdev writev readv 8 blocks ...passed 00:28:54.409 Test: blockdev writev readv 30 x 1block ...passed 00:28:54.409 Test: blockdev writev readv block ...passed 00:28:54.409 Test: blockdev writev readv size > 128k ...passed 00:28:54.409 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:54.409 Test: blockdev comparev and writev ...passed 00:28:54.409 Test: blockdev nvme passthru rw ...passed 00:28:54.409 Test: blockdev nvme passthru vendor specific ...passed 00:28:54.409 Test: blockdev nvme admin passthru ...passed 00:28:54.409 Test: blockdev copy ...passed 00:28:54.409 Suite: bdevio tests on: crypto_ram3 00:28:54.409 Test: blockdev write read block ...passed 00:28:54.409 Test: blockdev write zeroes read block ...passed 00:28:54.409 Test: blockdev write zeroes read no split ...passed 00:28:54.409 Test: blockdev write zeroes read split ...passed 00:28:54.409 Test: blockdev write zeroes read split partial ...passed 00:28:54.409 Test: blockdev reset ...passed 00:28:54.410 Test: blockdev write read 8 blocks ...passed 00:28:54.410 Test: blockdev write read size > 128k ...passed 00:28:54.410 Test: blockdev write read invalid size ...passed 00:28:54.410 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:54.410 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:54.410 Test: blockdev write read max offset ...passed 00:28:54.410 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:54.410 Test: blockdev writev readv 8 blocks ...passed 00:28:54.410 Test: blockdev writev readv 30 x 1block ...passed 00:28:54.410 Test: blockdev writev readv block ...passed 00:28:54.410 Test: blockdev writev readv size > 128k ...passed 00:28:54.410 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:54.410 Test: blockdev comparev and writev ...passed 00:28:54.410 Test: blockdev nvme passthru rw ...passed 00:28:54.410 Test: blockdev nvme passthru vendor specific ...passed 00:28:54.410 Test: blockdev nvme admin passthru ...passed 00:28:54.410 Test: blockdev copy ...passed 00:28:54.410 Suite: bdevio tests on: crypto_ram2 00:28:54.410 Test: blockdev write read block ...passed 00:28:54.410 Test: blockdev write zeroes read block ...passed 00:28:54.671 Test: blockdev write zeroes read no split ...passed 00:28:54.671 Test: blockdev write zeroes read split ...passed 00:28:54.932 Test: blockdev write zeroes read split partial ...passed 
00:28:54.932 Test: blockdev reset ...passed 00:28:54.932 Test: blockdev write read 8 blocks ...passed 00:28:54.932 Test: blockdev write read size > 128k ...passed 00:28:54.932 Test: blockdev write read invalid size ...passed 00:28:54.932 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:54.932 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:54.932 Test: blockdev write read max offset ...passed 00:28:54.932 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:54.932 Test: blockdev writev readv 8 blocks ...passed 00:28:54.932 Test: blockdev writev readv 30 x 1block ...passed 00:28:54.932 Test: blockdev writev readv block ...passed 00:28:54.932 Test: blockdev writev readv size > 128k ...passed 00:28:54.932 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:54.932 Test: blockdev comparev and writev ...passed 00:28:54.932 Test: blockdev nvme passthru rw ...passed 00:28:54.932 Test: blockdev nvme passthru vendor specific ...passed 00:28:54.932 Test: blockdev nvme admin passthru ...passed 00:28:54.932 Test: blockdev copy ...passed 00:28:54.932 Suite: bdevio tests on: crypto_ram 00:28:54.932 Test: blockdev write read block ...passed 00:28:54.932 Test: blockdev write zeroes read block ...passed 00:28:54.932 Test: blockdev write zeroes read no split ...passed 00:28:55.193 Test: blockdev write zeroes read split ...passed 00:28:55.193 Test: blockdev write zeroes read split partial ...passed 00:28:55.193 Test: blockdev reset ...passed 00:28:55.193 Test: blockdev write read 8 blocks ...passed 00:28:55.193 Test: blockdev write read size > 128k ...passed 00:28:55.193 Test: blockdev write read invalid size ...passed 00:28:55.193 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:55.193 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:55.193 Test: blockdev write read max offset ...passed 00:28:55.193 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:55.193 Test: blockdev writev readv 8 blocks ...passed 00:28:55.193 Test: blockdev writev readv 30 x 1block ...passed 00:28:55.193 Test: blockdev writev readv block ...passed 00:28:55.193 Test: blockdev writev readv size > 128k ...passed 00:28:55.193 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:55.193 Test: blockdev comparev and writev ...passed 00:28:55.193 Test: blockdev nvme passthru rw ...passed 00:28:55.193 Test: blockdev nvme passthru vendor specific ...passed 00:28:55.193 Test: blockdev nvme admin passthru ...passed 00:28:55.193 Test: blockdev copy ...passed 00:28:55.193 00:28:55.193 Run Summary: Type Total Ran Passed Failed Inactive 00:28:55.193 suites 4 4 n/a 0 0 00:28:55.193 tests 92 92 92 0 0 00:28:55.193 asserts 520 520 520 0 n/a 00:28:55.193 00:28:55.193 Elapsed time = 1.870 seconds 00:28:55.193 0 00:28:55.193 08:04:39 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1791549 00:28:55.193 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1791549 ']' 00:28:55.193 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1791549 00:28:55.193 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:28:55.193 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:55.454 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 1791549 00:28:55.454 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:55.454 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:55.454 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1791549' 00:28:55.454 killing process with pid 1791549 00:28:55.454 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1791549 00:28:55.454 08:04:39 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1791549 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:28:55.715 00:28:55.715 real 0m3.976s 00:28:55.715 user 0m10.762s 00:28:55.715 sys 0m0.395s 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:55.715 ************************************ 00:28:55.715 END TEST bdev_bounds 00:28:55.715 ************************************ 00:28:55.715 08:04:40 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:28:55.715 08:04:40 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:28:55.715 08:04:40 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:55.715 08:04:40 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:55.715 08:04:40 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:28:55.715 ************************************ 00:28:55.715 START TEST bdev_nbd 00:28:55.715 ************************************ 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 
-- # bdev_num=4 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1792196 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1792196 /var/tmp/spdk-nbd.sock 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1792196 ']' 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:28:55.715 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:55.715 08:04:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:55.715 [2024-07-15 08:04:40.370619] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
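[Editor's sketch, not part of the captured trace] The nbd_function_test flow seen starting above reduces to three steps: launch the bare bdev_svc app with the generated bdev.json on a dedicated RPC socket, wait for it to listen, then export each crypto vbdev as a kernel block device over NBD. A minimal bash sketch of that sequence follows; the paths are the ones from this workspace, the SPDK/SOCK variable names are only illustrative, and the rpc_get_methods polling loop is an assumed stand-in for the waitforlisten helper the real script uses.

    # Sketch only -- paths copied from this workspace; the readiness poll is an assumption.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-nbd.sock
    "$SPDK"/test/app/bdev_svc/bdev_svc -r "$SOCK" -i 0 --json "$SPDK"/test/bdev/bdev.json &
    until "$SPDK"/scripts/rpc.py -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
    # Export each crypto vbdev over NBD, mirroring the nbd_start_disk calls in the trace.
    bdev_list=(crypto_ram crypto_ram2 crypto_ram3 crypto_ram4)
    nbd_list=(/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3)
    for i in "${!bdev_list[@]}"; do
        "$SPDK"/scripts/rpc.py -s "$SOCK" nbd_start_disk "${bdev_list[$i]}" "${nbd_list[$i]}"
    done

As the first start/stop pass below shows, nbd_start_disk can also be called without a device argument, in which case the RPC picks a free /dev/nbdX and returns it.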
00:28:55.715 [2024-07-15 08:04:40.370662] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:55.715 [2024-07-15 08:04:40.457498] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:55.974 [2024-07-15 08:04:40.525246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:55.974 [2024-07-15 08:04:40.546258] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:28:55.974 [2024-07-15 08:04:40.554281] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:55.974 [2024-07-15 08:04:40.562298] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:55.974 [2024-07-15 08:04:40.645667] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:28:58.513 [2024-07-15 08:04:42.812092] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:28:58.513 [2024-07-15 08:04:42.812146] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:58.513 [2024-07-15 08:04:42.812155] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:58.513 [2024-07-15 08:04:42.820110] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:28:58.513 [2024-07-15 08:04:42.820121] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:58.513 [2024-07-15 08:04:42.820127] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:58.513 [2024-07-15 08:04:42.828129] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:28:58.513 [2024-07-15 08:04:42.828141] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:58.513 [2024-07-15 08:04:42.828146] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:58.513 [2024-07-15 08:04:42.836149] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:28:58.513 [2024-07-15 08:04:42.836159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:58.513 [2024-07-15 08:04:42.836166] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:58.513 08:04:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:58.513 08:04:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:28:58.513 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:28:58.513 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:58.513 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:58.514 08:04:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:58.514 1+0 records in 00:28:58.514 1+0 records out 00:28:58.514 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273081 s, 15.0 MB/s 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:58.514 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:28:58.773 
08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:58.773 1+0 records in 00:28:58.773 1+0 records out 00:28:58.773 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271971 s, 15.1 MB/s 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:58.773 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:59.033 1+0 records in 00:28:59.033 1+0 records out 00:28:59.033 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256971 s, 15.9 MB/s 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:59.033 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:59.293 1+0 records in 00:28:59.293 1+0 records out 00:28:59.293 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208265 s, 19.7 MB/s 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:59.293 08:04:43 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:59.293 08:04:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:59.293 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd0", 00:28:59.293 "bdev_name": "crypto_ram" 00:28:59.293 }, 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd1", 00:28:59.293 "bdev_name": "crypto_ram2" 00:28:59.293 }, 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd2", 00:28:59.293 "bdev_name": "crypto_ram3" 00:28:59.293 }, 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd3", 00:28:59.293 "bdev_name": "crypto_ram4" 00:28:59.293 } 00:28:59.293 ]' 00:28:59.293 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:59.293 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd0", 00:28:59.293 "bdev_name": "crypto_ram" 00:28:59.293 }, 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd1", 00:28:59.293 "bdev_name": "crypto_ram2" 00:28:59.293 }, 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd2", 00:28:59.293 "bdev_name": "crypto_ram3" 00:28:59.293 }, 00:28:59.293 { 00:28:59.293 "nbd_device": "/dev/nbd3", 00:28:59.293 "bdev_name": "crypto_ram4" 00:28:59.293 } 00:28:59.293 ]' 00:28:59.293 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
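[Editor's sketch, not part of the captured trace] The nbd_get_disks output shown just above is plain JSON, one object per export with nbd_device and bdev_name fields, and the helper simply pipes it through jq to recover the device list. A small sketch of that query, reusing the same socket and the illustrative SPDK/SOCK variables from the earlier sketch:

    # Query the current NBD exports and pull out the device paths, as the trace does.
    disks_json=$("$SPDK"/scripts/rpc.py -s "$SOCK" nbd_get_disks)
    echo "$disks_json" | jq -r '.[] | .nbd_device'            # /dev/nbd0, /dev/nbd1, ...
    # Counting the matches is how nbd_get_count verifies how many exports remain
    # (grep -c prints 0 but exits non-zero when nothing matches, hence the || true).
    echo "$disks_json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true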
00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:59.554 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:59.814 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:00.101 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
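[Editor's sketch, not part of the captured trace] The stop half of the verify, traced above, mirrors the start: each export is torn down over the same RPC socket and the script waits for the name to drop out of /proc/partitions (waitfornbd_exit) before moving on; nbd_get_count then confirms nothing is left. A condensed sketch under the same assumptions as before, with the 20-attempt retry bound taken from the trace and the 0.1 s sleep assumed:

    # Condensed teardown sketch; the retry loop mirrors waitfornbd_exit above.
    for dev in /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3; do
        "$SPDK"/scripts/rpc.py -s "$SOCK" nbd_stop_disk "$dev"
        name=$(basename "$dev")
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$name" /proc/partitions || break   # gone from the partition table
            sleep 0.1
        done
    done
    count=$("$SPDK"/scripts/rpc.py -s "$SOCK" nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    [ "$count" -eq 0 ] && echo "all nbd exports cleaned up"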
00:29:00.361 08:04:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:00.361 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:00.622 /dev/nbd0 00:29:00.622 08:04:45 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:00.622 1+0 records in 00:29:00.622 1+0 records out 00:29:00.622 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287862 s, 14.2 MB/s 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:00.622 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:29:00.883 /dev/nbd1 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:00.883 1+0 records in 00:29:00.883 1+0 records out 00:29:00.883 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267457 s, 15.3 MB/s 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:00.883 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:29:01.144 /dev/nbd10 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:01.144 1+0 records in 00:29:01.144 1+0 records out 00:29:01.144 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280887 s, 14.6 MB/s 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:29:01.144 08:04:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:29:01.405 /dev/nbd11 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:01.405 1+0 records in 00:29:01.405 1+0 records out 00:29:01.405 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289803 s, 14.1 MB/s 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:01.405 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd0", 00:29:01.666 "bdev_name": "crypto_ram" 00:29:01.666 }, 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd1", 00:29:01.666 "bdev_name": "crypto_ram2" 00:29:01.666 }, 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd10", 00:29:01.666 "bdev_name": "crypto_ram3" 00:29:01.666 }, 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd11", 00:29:01.666 "bdev_name": "crypto_ram4" 00:29:01.666 } 00:29:01.666 ]' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd0", 00:29:01.666 "bdev_name": "crypto_ram" 00:29:01.666 }, 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd1", 00:29:01.666 "bdev_name": "crypto_ram2" 00:29:01.666 }, 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd10", 00:29:01.666 "bdev_name": "crypto_ram3" 00:29:01.666 }, 00:29:01.666 { 00:29:01.666 "nbd_device": "/dev/nbd11", 00:29:01.666 "bdev_name": "crypto_ram4" 00:29:01.666 } 00:29:01.666 ]' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:01.666 /dev/nbd1 00:29:01.666 /dev/nbd10 00:29:01.666 /dev/nbd11' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:01.666 /dev/nbd1 00:29:01.666 /dev/nbd10 00:29:01.666 /dev/nbd11' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:01.666 256+0 records in 00:29:01.666 256+0 records out 00:29:01.666 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0119959 s, 87.4 MB/s 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:01.666 256+0 records in 00:29:01.666 256+0 records out 00:29:01.666 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0521679 s, 20.1 MB/s 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:01.666 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:01.927 256+0 records in 00:29:01.927 256+0 records out 00:29:01.927 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0464412 s, 22.6 MB/s 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:29:01.927 256+0 records in 00:29:01.927 256+0 records out 00:29:01.927 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.03894 s, 26.9 MB/s 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:29:01.927 256+0 records in 00:29:01.927 256+0 records out 00:29:01.927 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0452207 s, 23.2 MB/s 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:01.927 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:01.928 
08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:01.928 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:02.189 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:02.451 08:04:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:02.451 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.710 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:02.970 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:03.229 malloc_lvol_verify 00:29:03.229 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:03.229 60ac9818-365c-43dc-8e39-947ee55f805b 00:29:03.229 08:04:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:29:03.488 ce6ee66b-264e-42a5-9fc5-6b786252ba90 00:29:03.488 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:03.747 /dev/nbd0 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:03.747 mke2fs 1.46.5 (30-Dec-2021) 00:29:03.747 Discarding device blocks: 0/4096 done 00:29:03.747 Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:03.747 00:29:03.747 Allocating group tables: 0/1 done 00:29:03.747 Writing inode tables: 0/1 done 00:29:03.747 Creating journal (1024 blocks): done 00:29:03.747 Writing superblocks and filesystem accounting information: 0/1 done 00:29:03.747 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.747 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1792196 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1792196 ']' 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1792196 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1792196 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1792196' 00:29:04.007 killing process with pid 1792196 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1792196 00:29:04.007 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1792196 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:29:04.267 00:29:04.267 real 0m8.583s 00:29:04.267 user 0m11.864s 00:29:04.267 sys 0m2.395s 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:04.267 ************************************ 00:29:04.267 END TEST bdev_nbd 00:29:04.267 ************************************ 00:29:04.267 08:04:48 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:29:04.267 08:04:48 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:04.267 08:04:48 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:04.267 08:04:48 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:04.267 08:04:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:04.267 ************************************ 00:29:04.267 START TEST bdev_fio 00:29:04.267 ************************************ 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:04.267 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:29:04.267 08:04:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:04.528 ************************************ 00:29:04.528 START TEST bdev_fio_rw_verify 00:29:04.528 ************************************ 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:04.528 08:04:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:04.788 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.788 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.788 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.788 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.788 fio-3.35 00:29:04.788 Starting 4 threads 00:29:19.692 00:29:19.692 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1794467: Mon Jul 15 08:05:02 2024 00:29:19.692 read: IOPS=32.4k, BW=126MiB/s (133MB/s)(1265MiB/10001msec) 00:29:19.692 slat (usec): min=14, max=359, avg=40.04, stdev=27.08 00:29:19.692 clat (usec): min=8, max=2121, avg=215.58, stdev=153.21 00:29:19.692 lat (usec): min=23, max=2284, avg=255.62, stdev=169.26 00:29:19.692 clat percentiles (usec): 00:29:19.692 | 50.000th=[ 174], 99.000th=[ 701], 99.900th=[ 996], 99.990th=[ 1418], 00:29:19.692 | 99.999th=[ 2008] 00:29:19.692 write: IOPS=35.6k, BW=139MiB/s (146MB/s)(1354MiB/9726msec); 0 zone resets 00:29:19.692 slat (usec): min=15, max=478, avg=50.02, stdev=26.19 00:29:19.692 clat (usec): min=23, max=1750, avg=283.15, stdev=189.98 00:29:19.692 lat (usec): min=47, max=1875, avg=333.17, stdev=206.08 00:29:19.692 clat percentiles (usec): 00:29:19.692 | 50.000th=[ 243], 99.000th=[ 857], 99.900th=[ 1172], 99.990th=[ 1385], 00:29:19.692 | 99.999th=[ 1598] 00:29:19.692 bw ( KiB/s): min=91984, max=155680, per=97.12%, avg=138459.37, stdev=4083.73, samples=76 00:29:19.692 iops : min=22996, max=38920, avg=34614.79, stdev=1020.94, samples=76 00:29:19.692 lat (usec) : 10=0.01%, 20=0.01%, 50=3.32%, 100=14.08%, 250=44.98% 00:29:19.692 lat (usec) : 500=27.17%, 750=8.74%, 1000=1.47% 00:29:19.692 lat (msec) : 2=0.24%, 4=0.01% 00:29:19.692 cpu : usr=99.69%, sys=0.00%, ctx=48, majf=0, minf=229 00:29:19.692 IO depths : 1=10.4%, 2=23.6%, 4=52.7%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:19.692 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:19.692 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:19.692 issued rwts: 
total=323775,346643,0,0 short=0,0,0,0 dropped=0,0,0,0 00:29:19.692 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:19.692 00:29:19.692 Run status group 0 (all jobs): 00:29:19.692 READ: bw=126MiB/s (133MB/s), 126MiB/s-126MiB/s (133MB/s-133MB/s), io=1265MiB (1326MB), run=10001-10001msec 00:29:19.692 WRITE: bw=139MiB/s (146MB/s), 139MiB/s-139MiB/s (146MB/s-146MB/s), io=1354MiB (1420MB), run=9726-9726msec 00:29:19.692 00:29:19.692 real 0m13.351s 00:29:19.692 user 0m49.847s 00:29:19.692 sys 0m0.390s 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:19.692 ************************************ 00:29:19.692 END TEST bdev_fio_rw_verify 00:29:19.692 ************************************ 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:19.692 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "994519de-4b7f-5b3e-b859-7d4c544f67ca"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "994519de-4b7f-5b3e-b859-7d4c544f67ca",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "72404fa0-335d-5dd6-b7d3-25f1477f4fc7"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "72404fa0-335d-5dd6-b7d3-25f1477f4fc7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "4aa0e128-94d3-5696-ab23-7bb37f3c441c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4aa0e128-94d3-5696-ab23-7bb37f3c441c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram4",' ' "aliases": [' ' "4b776d03-5479-5748-8b95-d6e8c3b6f5fe"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4b776d03-5479-5748-8b95-d6e8c3b6f5fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:19.693 crypto_ram2 00:29:19.693 crypto_ram3 00:29:19.693 crypto_ram4 ]] 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "994519de-4b7f-5b3e-b859-7d4c544f67ca"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "994519de-4b7f-5b3e-b859-7d4c544f67ca",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "72404fa0-335d-5dd6-b7d3-25f1477f4fc7"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "72404fa0-335d-5dd6-b7d3-25f1477f4fc7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "4aa0e128-94d3-5696-ab23-7bb37f3c441c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4aa0e128-94d3-5696-ab23-7bb37f3c441c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "4b776d03-5479-5748-8b95-d6e8c3b6f5fe"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4b776d03-5479-5748-8b95-d6e8c3b6f5fe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:19.693 08:05:02 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:19.693 ************************************ 00:29:19.693 START TEST bdev_fio_trim 00:29:19.693 ************************************ 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:29:19.693 08:05:02 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:19.693 08:05:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:19.693 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:19.693 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:19.693 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:19.693 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:19.693 fio-3.35 00:29:19.693 Starting 4 threads 00:29:31.919 00:29:31.919 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1796968: Mon Jul 15 08:05:15 2024 00:29:31.919 write: IOPS=51.1k, BW=200MiB/s (209MB/s)(1998MiB/10001msec); 0 zone resets 00:29:31.919 slat (usec): min=13, max=466, avg=47.03, stdev=36.11 00:29:31.919 clat (usec): min=34, max=1704, avg=223.35, stdev=199.05 
00:29:31.919 lat (usec): min=48, max=1897, avg=270.38, stdev=227.92 00:29:31.919 clat percentiles (usec): 00:29:31.919 | 50.000th=[ 157], 99.000th=[ 1074], 99.900th=[ 1172], 99.990th=[ 1270], 00:29:31.919 | 99.999th=[ 1532] 00:29:31.919 bw ( KiB/s): min=155112, max=242752, per=100.00%, avg=204833.68, stdev=7136.01, samples=76 00:29:31.919 iops : min=38778, max=60688, avg=51208.42, stdev=1784.00, samples=76 00:29:31.919 trim: IOPS=51.1k, BW=200MiB/s (209MB/s)(1998MiB/10001msec); 0 zone resets 00:29:31.919 slat (usec): min=4, max=425, avg= 9.70, stdev= 5.44 00:29:31.919 clat (usec): min=41, max=1263, avg=192.35, stdev=111.53 00:29:31.919 lat (usec): min=51, max=1283, avg=202.05, stdev=114.14 00:29:31.919 clat percentiles (usec): 00:29:31.919 | 50.000th=[ 167], 99.000th=[ 693], 99.900th=[ 758], 99.990th=[ 848], 00:29:31.919 | 99.999th=[ 988] 00:29:31.919 bw ( KiB/s): min=155112, max=242752, per=100.00%, avg=204835.37, stdev=7136.39, samples=76 00:29:31.919 iops : min=38778, max=60688, avg=51208.84, stdev=1784.10, samples=76 00:29:31.919 lat (usec) : 50=2.51%, 100=17.03%, 250=56.65%, 500=18.45%, 750=3.44% 00:29:31.919 lat (usec) : 1000=1.12% 00:29:31.919 lat (msec) : 2=0.79% 00:29:31.919 cpu : usr=99.67%, sys=0.00%, ctx=80, majf=0, minf=109 00:29:31.919 IO depths : 1=8.5%, 2=22.4%, 4=55.3%, 8=13.8%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:31.919 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:31.919 complete : 0=0.0%, 4=87.9%, 8=12.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:31.919 issued rwts: total=0,511399,511400,0 short=0,0,0,0 dropped=0,0,0,0 00:29:31.919 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:31.919 00:29:31.919 Run status group 0 (all jobs): 00:29:31.919 WRITE: bw=200MiB/s (209MB/s), 200MiB/s-200MiB/s (209MB/s-209MB/s), io=1998MiB (2095MB), run=10001-10001msec 00:29:31.919 TRIM: bw=200MiB/s (209MB/s), 200MiB/s-200MiB/s (209MB/s-209MB/s), io=1998MiB (2095MB), run=10001-10001msec 00:29:31.919 00:29:31.919 real 0m13.551s 00:29:31.919 user 0m50.005s 00:29:31.919 sys 0m0.509s 00:29:31.919 08:05:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:31.919 08:05:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:31.919 ************************************ 00:29:31.919 END TEST bdev_fio_trim 00:29:31.919 ************************************ 00:29:31.919 08:05:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:29:31.919 08:05:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:31.919 08:05:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:31.920 08:05:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:29:31.920 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:31.920 08:05:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:31.920 00:29:31.920 real 0m27.255s 00:29:31.920 user 1m40.045s 00:29:31.920 sys 0m1.075s 00:29:31.920 08:05:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:31.920 08:05:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:31.920 ************************************ 00:29:31.920 END TEST bdev_fio 00:29:31.920 ************************************ 00:29:31.920 08:05:16 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 
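Note: the two fio stages above (bdev_fio_rw_verify and bdev_fio_trim) both drive fio through SPDK's bdev ioengine rather than against kernel block devices; the rw-verify pass additionally passes --spdk_mem=0. A minimal sketch of that invocation, assembled only from the paths and parameters recorded in the log (wrapping it in a standalone script is illustrative):

#!/usr/bin/env bash
# Replay the generated bdev.fio job file against the crypto bdevs by
# preloading SPDK's fio bdev engine, as the fio_bdev helper does above.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    "$SPDK/test/bdev/bdev.fio" \
    --verify_state_save=0 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
    --aux-path="$SPDK/../output"

The [job_crypto_ram] through [job_crypto_ram4] sections are emitted by blockdev.sh from the bdev JSON shown above, so the same bdev.fio serves both passes; only the generated workload differs (the verify pass reports rw=randwrite in its job banner, the trim pass rw=trimwrite).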
00:29:31.920 08:05:16 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:31.920 08:05:16 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:31.920 08:05:16 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:29:31.920 08:05:16 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:31.920 08:05:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:31.920 ************************************ 00:29:31.920 START TEST bdev_verify 00:29:31.920 ************************************ 00:29:31.920 08:05:16 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:31.920 [2024-07-15 08:05:16.366066] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:29:31.920 [2024-07-15 08:05:16.366120] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1798572 ] 00:29:31.920 [2024-07-15 08:05:16.455396] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:31.920 [2024-07-15 08:05:16.524767] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:31.920 [2024-07-15 08:05:16.524957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:31.920 [2024-07-15 08:05:16.545976] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:29:31.920 [2024-07-15 08:05:16.554004] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:31.920 [2024-07-15 08:05:16.562027] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:31.920 [2024-07-15 08:05:16.649388] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:29:34.456 [2024-07-15 08:05:18.809708] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:29:34.456 [2024-07-15 08:05:18.809765] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:34.456 [2024-07-15 08:05:18.809773] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.456 [2024-07-15 08:05:18.817727] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:29:34.456 [2024-07-15 08:05:18.817738] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:34.456 [2024-07-15 08:05:18.817743] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.456 [2024-07-15 08:05:18.825750] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:29:34.456 [2024-07-15 08:05:18.825760] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:34.456 [2024-07-15 08:05:18.825765] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.456 [2024-07-15 08:05:18.833768] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:34.456 [2024-07-15 08:05:18.833777] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:34.456 [2024-07-15 08:05:18.833783] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.456 Running I/O for 5 seconds... 00:29:39.743 00:29:39.743 Latency(us) 00:29:39.743 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:39.743 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x0 length 0x1000 00:29:39.743 crypto_ram : 5.06 581.88 2.27 0.00 0.00 219555.55 7057.72 129862.10 00:29:39.743 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x1000 length 0x1000 00:29:39.743 crypto_ram : 5.06 480.47 1.88 0.00 0.00 265812.12 9023.80 158899.59 00:29:39.743 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x0 length 0x1000 00:29:39.743 crypto_ram2 : 5.06 581.77 2.27 0.00 0.00 218949.44 7461.02 124215.93 00:29:39.743 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x1000 length 0x1000 00:29:39.743 crypto_ram2 : 5.06 480.37 1.88 0.00 0.00 264955.19 9830.40 145187.45 00:29:39.743 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x0 length 0x1000 00:29:39.743 crypto_ram3 : 5.05 4547.13 17.76 0.00 0.00 27885.57 2369.38 25105.33 00:29:39.743 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x1000 length 0x1000 00:29:39.743 crypto_ram3 : 5.05 3748.95 14.64 0.00 0.00 33809.59 4032.98 28029.24 00:29:39.743 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x0 length 0x1000 00:29:39.743 crypto_ram4 : 5.05 4556.77 17.80 0.00 0.00 27789.37 2079.51 23996.26 00:29:39.743 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.743 Verification LBA range: start 0x1000 length 0x1000 00:29:39.743 crypto_ram4 : 5.06 3746.67 14.64 0.00 0.00 33741.91 4663.14 27424.30 00:29:39.743 =================================================================================================================== 00:29:39.743 Total : 18724.00 73.14 0.00 0.00 54336.14 2079.51 158899.59 00:29:39.743 00:29:39.743 real 0m7.926s 00:29:39.743 user 0m15.254s 00:29:39.743 sys 0m0.254s 00:29:39.743 08:05:24 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:39.743 08:05:24 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:39.743 ************************************ 00:29:39.743 END TEST bdev_verify 00:29:39.743 ************************************ 00:29:39.744 08:05:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:29:39.744 08:05:24 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 
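Note: the bdev_verify stage that just completed and the bdev_verify_big_io stage being launched here exercise the same bdevperf example binary; only the I/O size changes (-o 4096 versus -o 65536). A sketch of the underlying command with the flag values recorded in the log (the flag annotations reflect common bdevperf usage; -C is passed through exactly as the harness does):

# -q 128     queue depth per job
# -o 4096    I/O size in bytes (the big-I/O variant uses -o 65536)
# -w verify  the verify workload used by both stages
# -t 5       run time in seconds
# -m 0x3     core mask: reactors on cores 0 and 1, matching the notices above
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3

In the latency table above, each crypto bdev appears twice (Core Mask 0x1 and 0x2), and the 512-byte-block bdevs (crypto_ram, crypto_ram2) sustain far fewer IOPS than the 4096-byte-block crypto_ram3/crypto_ram4.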
00:29:39.744 08:05:24 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:29:39.744 08:05:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:39.744 08:05:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:29:39.744 ************************************ 00:29:39.744 START TEST bdev_verify_big_io 00:29:39.744 ************************************ 00:29:39.744 08:05:24 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:39.744 [2024-07-15 08:05:24.372053] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:29:39.744 [2024-07-15 08:05:24.372105] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1800005 ] 00:29:39.744 [2024-07-15 08:05:24.462386] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:40.003 [2024-07-15 08:05:24.537891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:40.003 [2024-07-15 08:05:24.537977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:40.003 [2024-07-15 08:05:24.559011] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:29:40.003 [2024-07-15 08:05:24.567038] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:40.003 [2024-07-15 08:05:24.575060] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:40.003 [2024-07-15 08:05:24.662365] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:29:42.546 [2024-07-15 08:05:26.831075] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:29:42.546 [2024-07-15 08:05:26.831134] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:42.546 [2024-07-15 08:05:26.831142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.546 [2024-07-15 08:05:26.839091] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:29:42.546 [2024-07-15 08:05:26.839102] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:42.546 [2024-07-15 08:05:26.839107] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.546 [2024-07-15 08:05:26.847115] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:29:42.546 [2024-07-15 08:05:26.847125] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:42.546 [2024-07-15 08:05:26.847130] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.546 [2024-07-15 08:05:26.855134] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:29:42.546 [2024-07-15 08:05:26.855143] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:42.546 [2024-07-15 08:05:26.855149] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.546 Running I/O for 5 seconds... 00:29:43.122 [2024-07-15 08:05:27.688161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.688567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.688694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.688746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.688784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.689171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.689184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.690864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.690915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.690952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.690989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.691528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.691571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.691619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.691655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.692089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.692100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.693323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.693369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.693414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.693450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.693952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.122 [2024-07-15 08:05:27.693992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.122 [2024-07-15 08:05:27.694029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:43.122 ... (the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 / accel_dpdk_cryptodev_task_alloc_resources repeats continuously between 08:05:27.694 and 08:05:28.015) ...
00:29:43.394 [2024-07-15 08:05:28.015622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:43.394 [2024-07-15 08:05:28.016512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.017700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.019264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.020830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.021093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.023672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.025477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.025885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.026261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.028337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.030096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.031668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.033198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.033523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.034752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.035272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.036827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.038387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.039752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.041310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.042881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.044676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.044993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.048080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.394 [2024-07-15 08:05:28.049652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.051452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.051999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.054079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.055890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.057304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.394 [2024-07-15 08:05:28.057679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.058104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.060785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.061782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.063324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.064900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.066209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.066586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.067857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.069426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.069740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.072149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.073728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.075527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.076312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.077701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.079276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.080840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.395 [2024-07-15 08:05:28.081103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.082082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.083718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.083759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.083796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.083835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.084193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.084659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.084699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.084744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.084781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.085044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.085938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.085980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.086904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.087810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.087854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.395 [2024-07-15 08:05:28.087890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.087925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.088443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.088594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.088631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.088668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.088705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.088971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.089968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.090962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.091822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.091867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.091903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.091939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.092367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.092456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.395 [2024-07-15 08:05:28.092496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.092542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.092578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.093002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.093931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.093973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.094009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.395 [2024-07-15 08:05:28.094044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.094302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.094391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.094429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.094466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.094502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.094765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.095584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.095626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.095662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.095697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.096097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.096188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.096225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.096261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.096297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.096775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.396 [2024-07-15 08:05:28.097589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.097631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.097667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.097702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.097967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.098057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.098093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.098130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.098167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.098518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.099514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.099558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.099594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.099630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.099965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.100056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.100098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.100134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.100170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.100541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.101462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.101505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.101541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.101584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.396 [2024-07-15 08:05:28.101848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.101948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.101985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.102020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.102056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.102502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.103976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.104452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.105739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.105782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.105818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.105853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.106148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.106240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.106277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.106313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.396 [2024-07-15 08:05:28.106349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.106607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.107982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.108020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.108418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.109359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.109401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.109438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.109473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.396 [2024-07-15 08:05:28.109764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.109854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.109891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.109927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.109963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.110221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.397 [2024-07-15 08:05:28.111297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.111799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.112152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.113773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.114033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.114981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.397 [2024-07-15 08:05:28.115532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.115868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.117493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.117535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.117572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.117609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.117946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.118060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.118097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.118134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.118169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.118428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.119889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.120147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.397 [2024-07-15 08:05:28.121235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.121863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.122154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.397 [2024-07-15 08:05:28.123599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.123635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.123897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.124979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.398 [2024-07-15 08:05:28.125351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.125882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.126841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.126884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.126920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.126955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.127273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.127366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.127403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.127439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.127475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.127738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.128851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.128893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.128929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.128965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.129311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.129404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.129440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.129476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.398 [2024-07-15 08:05:28.129511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.129806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.130646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.130688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.130730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.130767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.131033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.131123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.131170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.131206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.131245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.131559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.132562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.132606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.132642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.132687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.133189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.133339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.133377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.133413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.133449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.133791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.134700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.398 [2024-07-15 08:05:28.134746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.398 [2024-07-15 08:05:28.134783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
... [previous message repeated ~500 more times, log timestamps 00:29:43.398-00:29:43.932, wall clock 08:05:28.134-08:05:28.505] ...
00:29:43.932 [2024-07-15 08:05:28.505716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:43.932 [2024-07-15 08:05:28.506525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.932 [2024-07-15 08:05:28.506571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.932 [2024-07-15 08:05:28.506607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.932 [2024-07-15 08:05:28.506643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.932 [2024-07-15 08:05:28.506904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.506994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.507031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.507067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.507103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.507475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.508959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.509003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.509040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.509513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.510628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.510670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.510707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.510748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.933 [2024-07-15 08:05:28.511079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.511168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.511205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.511241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.511277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.511534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.512558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.512600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.512639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.512675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.512968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.513065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.513106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.513142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.513177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.513493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.515979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.516015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.933 [2024-07-15 08:05:28.516051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.516359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.517936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.518359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.519978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.520015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.520052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.520311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.933 [2024-07-15 08:05:28.521353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.521857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.522180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.523813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.524354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.525464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.525507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.525543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.525582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.525926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.526015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.933 [2024-07-15 08:05:28.526052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.526088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.526124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.526503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.527573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.527616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.933 [2024-07-15 08:05:28.527652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.527688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.528068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.528159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.528198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.528236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.528271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.528591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.529672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.529717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.529755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.529790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.530155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.530246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.530295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.530332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.530368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.530747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.934 [2024-07-15 08:05:28.531682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.531727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.531765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.531801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.532309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.532455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.532493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.532530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.532569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.532995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.533937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.533990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.534987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.535990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.934 [2024-07-15 08:05:28.536541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.536758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.537123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.538971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.539008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.539455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.540470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.540513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.540550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.540588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.540986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.541078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.541115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.541151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.934 [2024-07-15 08:05:28.541187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.541606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.542548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.542591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.542648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.542684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.543148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.543241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.543279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.543314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.543350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.543706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.544881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.544927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.544963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.544998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.545382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.545481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.545519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.545555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.545590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.545905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.547058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.547100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.934 [2024-07-15 08:05:28.547137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.547509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.547910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.548051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.548425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.548800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.549171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.549689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.551223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.934 [2024-07-15 08:05:28.551602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.551979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.552353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.552715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.553189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.553572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.553950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.554324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.554684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.556054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.556436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.556813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.557185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.557610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.558088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.935 [2024-07-15 08:05:28.558463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.558839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.559218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.559608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.561149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.561532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.561910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.562282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.562674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.563134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.563508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.563884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.564256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.564687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.566087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.566469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.566849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.567222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.567627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.568090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.568465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.568842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.569218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.569691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.935 [2024-07-15 08:05:28.571353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.572447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.574244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.576044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.576422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.578289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.580074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.580921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.581293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.581664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.583927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.585490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.585947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.586319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.586734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.588181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.589335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.589706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.590678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.590946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.593660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.595456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.596053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.596426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.935 [2024-07-15 08:05:28.596688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.598317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.599880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.601665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.602818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.603109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.604373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.604760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.606499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.608243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.608506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.610023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.611740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.613501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.615257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.615519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.618438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.935 [2024-07-15 08:05:28.620017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.621806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.622874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.623137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.624818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.626609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.627891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.936 [2024-07-15 08:05:28.628279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.628742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.631408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.632405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.633964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.635530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.635795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.636755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.637131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.638278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.639832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.640139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.642589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.644155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.645940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.646370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.646752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.648335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.649900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.651499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.653251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.653591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.656367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.936 [2024-07-15 08:05:28.656754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.936 [2024-07-15 08:05:28.657127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.471 [2024-07-15 08:05:29.054249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.471 [2024-07-15 08:05:29.057451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.059022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.060812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.061792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.063761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.065554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.066148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.066520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.069239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.070767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.072383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.074060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.074757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.075133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.076701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.078261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.080940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.082739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.083867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.084239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.086185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.087756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.089541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.090524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.471 [2024-07-15 08:05:29.092323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.092722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.094423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.471 [2024-07-15 08:05:29.096164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.098167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.099680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.101240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.102823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.105979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.107747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.109502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.111074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.112957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.114520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.116275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.116645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.119736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.120939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.122658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.124319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.126060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.126442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.127240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.128799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.472 [2024-07-15 08:05:29.131607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.133172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.134956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.135716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.137326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.138889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.140448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.142237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.145347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.145780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.146154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.147908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.149954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.151626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.153100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.154655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.156320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.157894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.159458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.161243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.163396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.165153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.166904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.168438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.472 [2024-07-15 08:05:29.171763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.173560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.174675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.176423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.178495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.180001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.180373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.181064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.183190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.184760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.186325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.188104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.188921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.190174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.191725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.193289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.196142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.197932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.198307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.198678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.200759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.202513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.204025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.205560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.472 [2024-07-15 08:05:29.207144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.207750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.209304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.210860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.212168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.472 [2024-07-15 08:05:29.213727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.473 [2024-07-15 08:05:29.215286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.473 [2024-07-15 08:05:29.217076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.473 [2024-07-15 08:05:29.220392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.473 [2024-07-15 08:05:29.221961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.223751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.224733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.226666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.228463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.229250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.229621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.232554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.233964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.235518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.237081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.237788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.238164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.239871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.241556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.736 [2024-07-15 08:05:29.244376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.246175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.247532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.247918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.249963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.251530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.253308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.254263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.256293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.256676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.257971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.259526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.736 [2024-07-15 08:05:29.261700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.263467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.263843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.264214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.266264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.266648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.268198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.269819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.271971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.273234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.274792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.276364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.737 [2024-07-15 08:05:29.279449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.279835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.281584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.281963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.282813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.283191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.283564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.283944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.285730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.286114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.286498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.286874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.287698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.288078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.288451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.288831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.290812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.291194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.291568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.291943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.292809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.293188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.293561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.293945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.737 [2024-07-15 08:05:29.295835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.296218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.296592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.296972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.297838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.298216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.298254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.298626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.300969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.301026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.301399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.301439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.302036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.302410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.302448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.302822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.304696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.304746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.305119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.305158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.305665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.306041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.306091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.306467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.737 [2024-07-15 08:05:29.308404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.308451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.308835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.308875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.309380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.309758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.309797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.310168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.312436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.312483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.312861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.737 [2024-07-15 08:05:29.312899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.313347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.313734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.313772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.314142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.316024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.316071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.316444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.316494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.317092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.317466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.317502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.317878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.738 [2024-07-15 08:05:29.319738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.319784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.320156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.320194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.320667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.321050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.322625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.322672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.323059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.323102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.323115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.323526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.323652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.324031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.324045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.324414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.324452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.324887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.325882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.327682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.327724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.327761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.328102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.329741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.738 [2024-07-15 08:05:29.329783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.329818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.329855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.330115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.331542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.331584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.331622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.331659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.331920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.332032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.332069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.332106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.332142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.332465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.333674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.333726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.333763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.333798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.334233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.334326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.334363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.334398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.334435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.334754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.738 [2024-07-15 08:05:29.336029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.336953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.337997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.338040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.338078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.338115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.338606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.738 [2024-07-15 08:05:29.338758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.338798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.338836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.338877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.339304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.739 [2024-07-15 08:05:29.340609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.340818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.341079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.341940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.341982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.342727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.343175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.739 [2024-07-15 08:05:29.344653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.739 [2024-07-15 08:05:29.344689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.739 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats several hundred times between 08:05:29.344 and 08:05:29.826 ...]
00:29:45.273 [2024-07-15 08:05:29.826229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:45.273 [2024-07-15 08:05:29.828942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.273 [2024-07-15 08:05:29.828988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.273 [2024-07-15 08:05:29.829024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.830580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.830883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.831005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.831043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.831488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.831525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.832765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.832808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.834364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.834403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.834747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.834878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.836400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.836439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.836475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.837536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.839199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.839242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.839279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.839606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.840345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.274 [2024-07-15 08:05:29.840386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.840424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.842098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.844846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.844891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.844941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.846503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.846771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.846878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.846915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.848295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.848334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.849638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.849682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.851414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.851453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.851717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.851827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.853401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.853440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.853476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.854668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.856239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.856279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.274 [2024-07-15 08:05:29.856315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.856612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.857330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.857385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.857422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.857459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.860071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.860116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.861675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.861717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.862024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.862132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.863693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.863734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.863771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.865368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.865414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.865449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.866553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.866823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.868460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.868501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.870064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.870102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.274 [2024-07-15 08:05:29.871274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.872927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.872970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.873006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.873267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.875064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.875105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.875878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.877013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.877057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.877320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.878298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.879908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.879948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.881356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.881657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.883287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.884865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.885046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.885870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.886416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.888159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.889907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.890260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.274 [2024-07-15 08:05:29.890382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.891938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.892890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.894658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.895168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.897771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.898225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.899656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.900034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.900386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.902024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.903584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.905148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.906260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.906591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.908701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.910061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.911288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.911947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.912230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.913857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.915426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.916263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.917814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.274 [2024-07-15 08:05:29.918137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.920199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.920829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.921696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.923480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.923820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.925210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.926996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.928080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.928889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.929153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.930922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.932276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.933228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.934166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.934430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.934874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.935250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.274 [2024-07-15 08:05:29.936432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.937138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.937401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.939022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.940509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.941587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.275 [2024-07-15 08:05:29.942398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.942732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.943196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.944957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.945676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.946853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.947149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.948797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.950571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.950949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.951322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.951679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.952683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.954342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.954727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.955100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.955548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.957100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.957482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.957860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.959514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.959951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.961169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.961547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.275 [2024-07-15 08:05:29.961927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.963646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.964143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.965376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.965839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.967432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.968373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.968707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.969162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.969539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.971304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.971978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.972255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.974245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.975018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.976753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.977127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.977484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.978410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.979431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.980972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.981344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.981753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.983700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.275 [2024-07-15 08:05:29.984541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.984919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.985293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.985557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.986410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.987486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.987627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.990248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.991297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.992136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.992511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.992905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.993044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.994663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.995648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.996549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.996892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.998293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:29.999638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.000019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.000393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.000701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.001210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.003075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.275 [2024-07-15 08:05:30.003457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.003840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.004173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.005488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.005878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.006806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.007750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.008015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.008474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.008860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.009759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.010688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.010957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.012655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.013996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.015373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.015839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.016196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.016682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.018348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.019211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.020213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.020531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.275 [2024-07-15 08:05:30.022254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.540 [2024-07-15 08:05:30.024014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.024391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.024768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.025095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.026044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.027727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.028106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.029546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.029893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.031444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.033217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.033861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.035408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.035720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.037583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.038525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.040071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.041619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.041887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.043949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.044596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.044973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.045376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.045640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.540 [2024-07-15 08:05:30.046739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.047523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.540 [2024-07-15 08:05:30.048727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.049343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.049634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.051853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.051946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.052380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.053564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.053905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.055732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.056479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.058009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.058048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.058311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.059297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.060949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.061326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.061365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.061797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.063430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.063846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.063884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.065128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.541 [2024-07-15 08:05:30.065474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.067360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.068916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.068955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.069880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.070215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.071957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.072009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.072629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.073980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.074255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.076630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.076674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.078432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.080206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.080589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.080726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.081799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.083271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.083309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.083790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.084603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.086389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.541 [2024-07-15 08:05:30.087570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.541 [2024-07-15 08:05:30.087611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! entry repeats several hundred more times between 08:05:30.087611 and 08:05:30.395017 (wall-clock 00:29:45.541 through 00:29:45.808); duplicate entries omitted ...]
00:29:45.808 [2024-07-15 08:05:30.395017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:45.808 [2024-07-15 08:05:30.395559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.397009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.397388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.397765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.398151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.398515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.398976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.399353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.399732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.400106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.400475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.402666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.403053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.403428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.403804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.404209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.404669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.405049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.405427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.405803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.406269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.410040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.411840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.412718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.808 [2024-07-15 08:05:30.414285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.414586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.416451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.417524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.417663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.421348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.421751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.423250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.424290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.424639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.424681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.425121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.426541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.427372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.427714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.431500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.433289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.433997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.434370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.434631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.435152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.436600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.436992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.437366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.808 [2024-07-15 08:05:30.437752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.442215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.808 [2024-07-15 08:05:30.443421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.444294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.444678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.445016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.446680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.447652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.449141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.450726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.451128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.455538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.457302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.459058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.460814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.461079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.461527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.462300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.463863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.465425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.465756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.470485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.470868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.472198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.809 [2024-07-15 08:05:30.473750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.474037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.475670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.476808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.478366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.479928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.480257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.485159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.486917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.488406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.489974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.490300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.492003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.492380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.492756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.494303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.494569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.500379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.500425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.501719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.502092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.502439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.504103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.505657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.809 [2024-07-15 08:05:30.507221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.508422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.508741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.512278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.514000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.515755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.515793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.516056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.517817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.519429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.521106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.521144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.521407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.526051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.527620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.527658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.528857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.529147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.530817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.532382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.532420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.533691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.534108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.539872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.809 [2024-07-15 08:05:30.539919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.541251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.542906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.543170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.544864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.544905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.546256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.547803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.548226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.809 [2024-07-15 08:05:30.551880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.553636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.555126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.555165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.555462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.555575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.557328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.559084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.559122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.810 [2024-07-15 08:05:30.559609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.563644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.565004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.565043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.566603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.566956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.079 [2024-07-15 08:05:30.568198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.568575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.568612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.569991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.570287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.575512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.575560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.577105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.577484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.577966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.579795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.579838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.581400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.582958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.583224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.584038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.585608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.586990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.587027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.587446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.587591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.588903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.590463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.590500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.079 [2024-07-15 08:05:30.590828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.596443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.596896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.596935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.597305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.597570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.599207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.600769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.600807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.602002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.602266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.607265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.607312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.608870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.610434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.610742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.612041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.612082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.613633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.615195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.615511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.618737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.620503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.620553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.079 [2024-07-15 08:05:30.622114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.622379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.622493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.624152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.625756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.625795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.079 [2024-07-15 08:05:30.626069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.629090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.630668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.631860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.631898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.632204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.633847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.635413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.635952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.636325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.636587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.642544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.642589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.644357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.646081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.646589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.646732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.647429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.080 [2024-07-15 08:05:30.648991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.649029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.649374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.654936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.654984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.655879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.655917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.656266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.657935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.657976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.659731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.659769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.660030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.665963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.666010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.666049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.666085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.666347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.667494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.667535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.667571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.667985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.668248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.672532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.080 [2024-07-15 08:05:30.672583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.672623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.672659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.673023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.673115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.673152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.674716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.674754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.675156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.678751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.679054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.682497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.682542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.682579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.682614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.682885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.080 [2024-07-15 08:05:30.682979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.683018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.683053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.683090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.683395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.687466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.687512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.687548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.687586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.687875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.687977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.688016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.688052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.688100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.688358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.691983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.692019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.080 [2024-07-15 08:05:30.692461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.695493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.695541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.695577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.695613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.695915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.696010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.696048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.696085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.080 [2024-07-15 08:05:30.696122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.696663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.699472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.699522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.699558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.699593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.699886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.699987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.700024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.700061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.700097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.700418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.703302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.703348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.081 [2024-07-15 08:05:30.703384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.081 [2024-07-15 08:05:30.703420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated ...]
00:29:46.647 [2024-07-15 08:05:31.144767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.647 [2024-07-15 08:05:31.144813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.145187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.145563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.146012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.146471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.147990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.148028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.149048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.149384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.152706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.153972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.154010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.155744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.156019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.157753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.157794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.158165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.158538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.158911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.160788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.161390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.163139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.163177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.163480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.647 [2024-07-15 08:05:31.163595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.163987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.164886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.164924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.165253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.169991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.170038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.170658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.171920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.172382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.173057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.174329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.175865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.176241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.176705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.180529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.180575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.180952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.180990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.181254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.181369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.183127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.184884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.184922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.647 [2024-07-15 08:05:31.185184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.189858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.189905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.189941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.189977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.190242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.192000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.192041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.193790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.193833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.194097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.198387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.198434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.198470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.198506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.198856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.200139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.200180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.200216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.201888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.202193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.206351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.206411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.206447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.647 [2024-07-15 08:05:31.206483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.206857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.206953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.206990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.207363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.207401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.207664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.208626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.208669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.208705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.208747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.209057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.209155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.209192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.209228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.209264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.209564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.212822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.212869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.212905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.212941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.213227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.213322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.647 [2024-07-15 08:05:31.213360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.648 [2024-07-15 08:05:31.213396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.213432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.213750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.214671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.214722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.214759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.214795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.215131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.215228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.215265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.215301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.215337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.215697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.219738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.220060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.648 [2024-07-15 08:05:31.221079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.221670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.222015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.225944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.225996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.226861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.227877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.227920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.227959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.227995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.228468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.648 [2024-07-15 08:05:31.228564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.228604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.228640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.228675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.228973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.232734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.232780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.232816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.232852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.233113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.233204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.233242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.233278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.233314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.233613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.234659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.234706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.234756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.234796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.235057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.235152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.235189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.235225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.235261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.648 [2024-07-15 08:05:31.235607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.236635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.236678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.236719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.236758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.237020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.237111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.237149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.237186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.237222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.237528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.238954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.238997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.239033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.239068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.239379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.241020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.241061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.241098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.241137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.241397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.242292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.242334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.242370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.648 [2024-07-15 08:05:31.242409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.242720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.242815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.648 [2024-07-15 08:05:31.242852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.242889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.242926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.243369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.244601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.244648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.244684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.244724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.245026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.245121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.245157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.245194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.245236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.245495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.649 [2024-07-15 08:05:31.246932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.246968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.247404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.248459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.248502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.248538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.248577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.248869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.248961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.249004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.249041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.249081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.249341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.250227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.251984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.252952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.255528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.649 [2024-07-15 08:05:31.255577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.255616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.257916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.258948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.258991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.259363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.259401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.259668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.259818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.259857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.259894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.261460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.261774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.262783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.264354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.264393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.264432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.264747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.649 [2024-07-15 08:05:31.264875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.264913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.265288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.265326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.265765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.268171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.268216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.268251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.269450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.269753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.269872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.271437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.271475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.271511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.271775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.275026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.275075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.276606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.276654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.276950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.278581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.278622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.278658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.280196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.649 [2024-07-15 08:05:31.280543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.283698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.285480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.285520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.285556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.285848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.285959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.649 [2024-07-15 08:05:31.286000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.287557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.287596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.287893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.289214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.289260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.289297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.290825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.291155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.291279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.293034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.293073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.293109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.293367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.294197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.294243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.650 [2024-07-15 08:05:31.295807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.650 [2024-07-15 08:05:31.295846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.919 [2024-07-15 08:05:31.636118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.919 [2024-07-15 08:05:31.636172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.636891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.637208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.638191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.638233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.638269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.638306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.638643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.640479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.640520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.640556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.640599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.641017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.641886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.641931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.641969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.642005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.642325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.919 [2024-07-15 08:05:31.642415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.642452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.642489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.642525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.642853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.643678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.643724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.643761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.643796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.644150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.919 [2024-07-15 08:05:31.644240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.920 [2024-07-15 08:05:31.644277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.920 [2024-07-15 08:05:31.644313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.920 [2024-07-15 08:05:31.644349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.920 [2024-07-15 08:05:31.644714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.645620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.645662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.645721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.645759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.646071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.646160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.646196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.646233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.646269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.921 [2024-07-15 08:05:31.646536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.647698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.648570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.648610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.648646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.649017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.649142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.649179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.649218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.649256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.649545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.652242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.652287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.652323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.652694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.653192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.653336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.653374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.653410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.653445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.653776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.654763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.654806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.656360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.921 [2024-07-15 08:05:31.656400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.656694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.656817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.656854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.656890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.656926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.657265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.658214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.660623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.662409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.662837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.665762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.665808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.665844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.667401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.667664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.667773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.921 [2024-07-15 08:05:31.667811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.184 [2024-07-15 08:05:31.668855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.668895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.669216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.670117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.670161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.670534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.670577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.671026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.671173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.672759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.672797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.672833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.673164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.674226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.674691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.674735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.674771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.675079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.676725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.676766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.676802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.678620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.678967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.681685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.184 [2024-07-15 08:05:31.681734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.681771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.682143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.682555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.682680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.682727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.684356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.684394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.684653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.685661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.685704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.686714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.686754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.687261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.687386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.687763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.687801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.687837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.688198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.689882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.690265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.690303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.690340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.690749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.184 [2024-07-15 08:05:31.691211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.691253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.691289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.691661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.692120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.693582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.693627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.694998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.695317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.184 [2024-07-15 08:05:31.696750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.696796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.696834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.697206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.697528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.697653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.698326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.698365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.698402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.185 [2024-07-15 08:05:31.698743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.699747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.701374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.701413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.701449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.701888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.703123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.703164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.703202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.703572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.703967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.705294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.706418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.706457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.706833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.707172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.707297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.707336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.707372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.707407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.707668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.708649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.709033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.709407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.185 [2024-07-15 08:05:31.710939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.711379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.712469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.712512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.712548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.712923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.713303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.715373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.715759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.716138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.717892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.718337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.718462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.719669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.719712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.720085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.720530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.723425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.723810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.724189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.725344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.725734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.725865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.727635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.185 [2024-07-15 08:05:31.728012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.728049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.728463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.730923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.731303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.731675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.732399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.732699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.734000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.734623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.734808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.736743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.738520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.738909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.739281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.739730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.739870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.740766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.742415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.742796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.743280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.745232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.746034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.746407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.185 [2024-07-15 08:05:31.746785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.747262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.747728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.749488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.751047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.751929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.752194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.754319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.755874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.757436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.758260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.758577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.760226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.761388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.761765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.185 [2024-07-15 08:05:31.762138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.762488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.763938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.765634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.766011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.766384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.766678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.768340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.769904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.186 [2024-07-15 08:05:31.770644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.772193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.772519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.774208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.774591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.774977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.775350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.775669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.777301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.779055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.780009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.781563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.781830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.783144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.783526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.783917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.784290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.784552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.786016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.787133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.788921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.789364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.789761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.792445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.186 [2024-07-15 08:05:31.792852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.793225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.793994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.794326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.795625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.796250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.796628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.798229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.798585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.800578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.802164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.802840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.804491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.804758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.805204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.805790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.807334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.808894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.809163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.811581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.813340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.814285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.814658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.186 [2024-07-15 08:05:31.814924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.186 [2024-07-15 08:05:31.816545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:47.186 [... the same accel_dpdk_cryptodev.c: 468 *ERROR*: Failed to get src_mbufs! message repeats several hundred times through 08:05:32.120 ...]
00:29:47.456 [2024-07-15 08:05:32.120456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:47.456 [2024-07-15 08:05:32.120840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.120878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.122300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.122593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.124274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.124317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.124353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.125581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.125910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.127003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.127382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.127759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.128133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.128562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.128685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.128726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.128764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.128800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.129105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.130923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.132481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.134255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.134802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.135160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.456 [2024-07-15 08:05:32.135619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.135660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.135698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.136079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.136594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.137987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.139567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.140798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.141182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.141575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.141699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.143046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.143084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.144832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.145094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.456 [2024-07-15 08:05:32.147903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.149694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.150069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.150442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.150703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.150844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.152411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.154197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.154234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.457 [2024-07-15 08:05:32.154604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.157189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.158241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.158612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.159807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.160129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.161757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.163543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.163757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.166331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.167579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.167954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.168935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.169263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.169387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.170953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.172738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.173754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.174072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.175391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.175779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.177520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.179263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.179528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.457 [2024-07-15 08:05:32.181153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.182717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.184340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.186011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.186273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.189373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.190863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.192652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.193827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.194090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.195906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.197705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.199096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.199481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.199893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.457 [2024-07-15 08:05:32.202646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.203641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.205196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.206752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.207016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.207827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.208206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.209758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.211368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.720 [2024-07-15 08:05:32.211695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.214135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.215860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.217546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.217922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.218368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.220005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.221569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.223347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.224400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.224662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.226435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.226824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.228218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.229773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.230079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.231940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.233152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.234705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.236266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.236528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.239486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.241246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.243017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.720 [2024-07-15 08:05:32.244433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.244697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.246457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.248210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.249864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.250238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.250741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.253384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.254363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.255912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.257469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.257736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.258826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.259203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.260462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.262007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.262320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.264724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.266284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.268066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.268439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.268880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.270704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.272452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.720 [2024-07-15 08:05:32.274202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.275703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.275981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.278357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.278744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.279536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.281076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.281377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.720 [2024-07-15 08:05:32.283251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.284141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.285694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.287251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.287512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.289860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.291424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.292982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.294753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.295084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.296696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.298266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.298404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.300566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.302121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.303671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.721 [2024-07-15 08:05:32.305451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.305849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.305892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.307441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.309010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.310794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.311224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.313800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.315362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.317152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.318131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.318482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.320126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.321916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.322384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.322760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.323023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.325092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.325139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.326724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.328354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.328736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.329193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.329570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.721 [2024-07-15 08:05:32.329962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.330336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.330666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.331693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.333467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.334862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.334900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.335266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.337026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.338744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.339116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.339621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.339912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.342519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.344138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.344177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.345801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.346238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.346720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.347586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.348576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.348956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.349401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.721 [2024-07-15 08:05:32.351848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
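The burst of identical messages above is the dpdk_cryptodev accel module failing to pull source mbufs for new crypto tasks while the 128-deep, 64 KiB verify workload keeps the pool drained; the run still finishes and reports results below, so these read as transient allocation failures under load rather than fatal errors. The stand-alone sketch below reproduces the same exhaustion condition using only DPDK's public mbuf API; the pool name, sizes, and retry comment are illustrative assumptions, not SPDK's actual implementation.

    /* sketch.c - drain a deliberately tiny mbuf pool until allocation fails,
     * the same condition accel_dpdk_cryptodev_task_alloc_resources reports as
     * "Failed to get src_mbufs!". Build against DPDK (pkg-config libdpdk). */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_lcore.h>
    #include <rte_mbuf.h>

    int main(int argc, char **argv)
    {
        if (rte_eal_init(argc, argv) < 0)
            return 1;

        /* 64 mbufs, no per-core cache: small enough to exhaust immediately. */
        struct rte_mempool *pool = rte_pktmbuf_pool_create("sketch_pool", 64, 0, 0,
                RTE_MBUF_DEFAULT_BUF_SIZE, rte_socket_id());
        if (pool == NULL)
            return 1;

        struct rte_mbuf *held[128] = { 0 };
        int got = 0;

        for (int i = 0; i < 128; i++) {
            held[i] = rte_pktmbuf_alloc(pool);
            if (held[i] == NULL) {
                /* A consumer that cannot get an mbuf here has to back off and
                 * retry once in-flight operations free their buffers. */
                printf("pool exhausted after %d allocations\n", got);
                break;
            }
            got++;
        }

        for (int i = 0; i < got; i++)
            rte_pktmbuf_free(held[i]);

        return 0;
    }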
00:29:47.721 [2024-07-15 08:05:32.351894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:48.292
00:29:48.292 Latency(us)
00:29:48.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:48.292 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x0 length 0x100
00:29:48.292 crypto_ram : 5.65 47.98 3.00 0.00 0.00 2591760.17 13611.32 2168132.53
00:29:48.292 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x100 length 0x100
00:29:48.292 crypto_ram : 5.82 43.95 2.75 0.00 0.00 2832966.89 120989.54 2735976.76
00:29:48.292 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x0 length 0x100
00:29:48.292 crypto_ram2 : 5.65 47.97 3.00 0.00 0.00 2501313.80 13006.38 2168132.53
00:29:48.292 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x100 length 0x100
00:29:48.292 crypto_ram2 : 5.83 43.94 2.75 0.00 0.00 2723345.33 120182.94 2735976.76
00:29:48.292 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x0 length 0x100
00:29:48.292 crypto_ram3 : 5.54 339.85 21.24 0.00 0.00 339645.54 45371.08 432335.95
00:29:48.292 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x100 length 0x100
00:29:48.292 crypto_ram3 : 5.62 271.35 16.96 0.00 0.00 419617.01 12754.31 490410.93
00:29:48.292 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x0 length 0x100
00:29:48.292 crypto_ram4 : 5.61 356.81 22.30 0.00 0.00 316142.98 14216.27 416204.01
00:29:48.292 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.292 Verification LBA range: start 0x100 length 0x100
00:29:48.292 crypto_ram4 : 5.71 290.67 18.17 0.00 0.00 382865.76 22483.89 480731.77
00:29:48.292 ===================================================================================================================
00:29:48.292 Total : 1442.53 90.16 0.00 0.00 658528.42 12754.31 2735976.76
00:29:48.292
00:29:48.292 real 0m8.719s
00:29:48.292 user 0m16.766s
00:29:48.292 sys 0m0.309s
00:29:48.292 08:05:33 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:48.292 08:05:33 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:29:48.292 ************************************
00:29:48.292 END TEST bdev_verify_big_io
00:29:48.292 ************************************
00:29:48.553 08:05:33 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:29:48.553 08:05:33 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:48.553 08:05:33 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:29:48.553 08:05:33 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:48.553 08:05:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:48.553 ************************************
00:29:48.553 START TEST bdev_write_zeroes
************************************
08:05:33 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:48.553 [2024-07-15 08:05:33.179365] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:29:48.553 [2024-07-15 08:05:33.179418] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1801314 ]
00:29:48.553 [2024-07-15 08:05:33.269459] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:48.813 [2024-07-15 08:05:33.345658] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:29:48.813 [2024-07-15 08:05:33.366671] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:29:48.813 [2024-07-15 08:05:33.374695] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:29:48.813 [2024-07-15 08:05:33.382716] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:29:48.813 [2024-07-15 08:05:33.475053] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:29:51.352 [2024-07-15 08:05:35.641744] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:29:51.352 [2024-07-15 08:05:35.641793] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:29:51.352 [2024-07-15 08:05:35.641802] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.352 [2024-07-15 08:05:35.649757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:29:51.352 [2024-07-15 08:05:35.649768] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:29:51.352 [2024-07-15 08:05:35.649774] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.352 [2024-07-15 08:05:35.657777] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:29:51.352 [2024-07-15 08:05:35.657787] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:29:51.352 [2024-07-15 08:05:35.657793] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.352 [2024-07-15 08:05:35.665796] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:29:51.352 [2024-07-15 08:05:35.665806] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:29:51.352 [2024-07-15 08:05:35.665811] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.352 Running I/O for 1 seconds...
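For readers skimming the results that follow: the write_zeroes pass re-uses bdevperf with the options traced above. The annotations below are my reading of bdevperf's standard flags, not something stated in the log itself.

    # Same invocation as in the trace above, with the option meanings spelled out:
    #   --json <file>    bdev configuration to load (the bdev.json prepared by this job)
    #   -q 128           queue depth (outstanding I/Os per job)
    #   -o 4096          I/O size in bytes
    #   -w write_zeroes  workload type
    #   -t 1             run time in seconds
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
        -q 128 -o 4096 -w write_zeroes -t 1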
00:29:52.292
00:29:52.292 Latency(us)
00:29:52.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:52.292 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:52.292 crypto_ram : 1.02 2357.11 9.21 0.00 0.00 54066.05 4663.14 64124.46
00:29:52.292 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:52.292 crypto_ram2 : 1.02 2362.71 9.23 0.00 0.00 53684.82 4612.73 59688.17
00:29:52.292 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:52.292 crypto_ram3 : 1.02 18240.60 71.25 0.00 0.00 6928.77 2117.32 8872.57
00:29:52.292 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:52.292 crypto_ram4 : 1.02 18223.82 71.19 0.00 0.00 6909.77 2029.10 7208.96
00:29:52.292 ===================================================================================================================
00:29:52.292 Total : 41184.25 160.88 0.00 0.00 12319.58 2029.10 64124.46
00:29:52.292
00:29:52.292 real 0m3.888s
00:29:52.292 user 0m3.602s
00:29:52.292 sys 0m0.245s
00:29:52.292 08:05:37 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:52.292 08:05:37 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:29:52.292 ************************************
00:29:52.292 END TEST bdev_write_zeroes
00:29:52.292 ************************************
00:29:52.292 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:29:52.292 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:52.292 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:29:52.292 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:52.292 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:52.553 ************************************
00:29:52.553 START TEST bdev_json_nonenclosed
00:29:52.553 ************************************
00:29:52.553 08:05:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:52.553 [2024-07-15 08:05:37.148352] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:29:52.553 [2024-07-15 08:05:37.148416] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802020 ]
00:29:52.553 [2024-07-15 08:05:37.238957] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:52.814 [2024-07-15 08:05:37.316282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:29:52.814 [2024-07-15 08:05:37.316339] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:29:52.814 [2024-07-15 08:05:37.316350] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:29:52.814 [2024-07-15 08:05:37.316357] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:29:52.814
00:29:52.814 real 0m0.293s
00:29:52.814 user 0m0.181s
00:29:52.814 sys 0m0.110s
00:29:52.814 08:05:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:29:52.814 08:05:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:52.814 08:05:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:29:52.814 ************************************
00:29:52.814 END TEST bdev_json_nonenclosed
00:29:52.814 ************************************
00:29:52.814 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:29:52.814 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true
00:29:52.814 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:52.814 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:29:52.814 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:52.814 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:52.814 ************************************
00:29:52.814 START TEST bdev_json_nonarray
00:29:52.814 ************************************
00:29:52.814 08:05:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:52.814 [2024-07-15 08:05:37.504812] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization...
00:29:52.814 [2024-07-15 08:05:37.504859] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802203 ]
00:29:53.074 [2024-07-15 08:05:37.593197] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:53.074 [2024-07-15 08:05:37.669058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:29:53.074 [2024-07-15 08:05:37.669119] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
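Both JSON fixtures exercised here are negative tests: nonenclosed.json (finished just above) presumably omits the enclosing braces and nonarray.json presumably supplies "subsystems" as something other than an array, so json_config_prepare_ctx rejects each one and bdevperf exits non-zero, which the harness captures as es=234 (seen above for the first fixture and below for the second). For contrast, a minimal well-formed SPDK JSON config keeps a single top-level object whose "subsystems" member is an array of subsystem blocks; the bdev_malloc_create entry below is only an illustrative placeholder, not the contents of the bdev.json actually used by this run.

    {
      "subsystems": [
        {
          "subsystem": "bdev",
          "config": [
            {
              "method": "bdev_malloc_create",
              "params": { "name": "Malloc0", "num_blocks": 16384, "block_size": 512 }
            }
          ]
        }
      ]
    }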
00:29:53.074 [2024-07-15 08:05:37.669131] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:29:53.074 [2024-07-15 08:05:37.669138] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:29:53.074
00:29:53.074 real 0m0.275s
00:29:53.074 user 0m0.168s
00:29:53.074 sys 0m0.106s
00:29:53.074 08:05:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:29:53.074 08:05:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:53.074 08:05:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:29:53.074 ************************************
00:29:53.074 END TEST bdev_json_nonarray
00:29:53.074 ************************************
00:29:53.074 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]]
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]]
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]]
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]]
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]]
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]]
00:29:53.074 08:05:37 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]]
00:29:53.074
00:29:53.074 real 1m8.385s
00:29:53.074 user 2m45.416s
00:29:53.074 sys 0m6.198s
00:29:53.074 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:53.074 08:05:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:29:53.074 ************************************
00:29:53.074 END TEST blockdev_crypto_aesni
00:29:53.074 ************************************
00:29:53.074 08:05:37 -- common/autotest_common.sh@1142 -- # return 0
00:29:53.074 08:05:37 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:29:53.074 08:05:37 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:29:53.074 08:05:37 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:53.074 08:05:37 -- common/autotest_common.sh@10 -- # set +x
00:29:53.334 ************************************
00:29:53.334 START TEST blockdev_crypto_sw
00:29:53.334 ************************************
00:29:53.334 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:29:53.334 * Looking for test storage...
* Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # :
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']'
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device=
00:29:53.334 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek=
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx=
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc=
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']'
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]]
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]]
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1802305
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1802305
00:29:53.335 08:05:37 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc
00:29:53.335 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 1802305 ']'
00:29:53.335 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:29:53.335 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100
00:29:53.335 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:29:53.335 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:53.335 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:53.335 08:05:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:53.335 [2024-07-15 08:05:38.027182] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:29:53.335 [2024-07-15 08:05:38.027243] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802305 ] 00:29:53.595 [2024-07-15 08:05:38.119286] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:53.595 [2024-07-15 08:05:38.189263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:54.164 08:05:38 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:54.164 08:05:38 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:29:54.164 08:05:38 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:29:54.164 08:05:38 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:29:54.164 08:05:38 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:29:54.164 08:05:38 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.164 08:05:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.425 Malloc0 00:29:54.425 Malloc1 00:29:54.425 true 00:29:54.425 true 00:29:54.425 true 00:29:54.425 [2024-07-15 08:05:39.061272] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:54.425 crypto_ram 00:29:54.425 [2024-07-15 08:05:39.069294] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:54.425 crypto_ram2 00:29:54.425 [2024-07-15 08:05:39.077317] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:54.425 crypto_ram3 00:29:54.425 [ 00:29:54.425 { 00:29:54.425 "name": "Malloc1", 00:29:54.425 "aliases": [ 00:29:54.425 "a8f41a7d-0deb-43ea-967c-75dacec61c4a" 00:29:54.425 ], 00:29:54.425 "product_name": "Malloc disk", 00:29:54.425 "block_size": 4096, 00:29:54.425 "num_blocks": 4096, 00:29:54.425 "uuid": "a8f41a7d-0deb-43ea-967c-75dacec61c4a", 00:29:54.425 "assigned_rate_limits": { 00:29:54.425 "rw_ios_per_sec": 0, 00:29:54.425 "rw_mbytes_per_sec": 0, 00:29:54.425 "r_mbytes_per_sec": 0, 00:29:54.425 "w_mbytes_per_sec": 0 00:29:54.425 }, 00:29:54.425 "claimed": true, 00:29:54.425 "claim_type": "exclusive_write", 00:29:54.425 "zoned": false, 00:29:54.425 "supported_io_types": { 00:29:54.425 "read": true, 00:29:54.425 "write": true, 00:29:54.425 "unmap": true, 00:29:54.425 "flush": true, 00:29:54.425 "reset": true, 00:29:54.425 "nvme_admin": false, 00:29:54.425 "nvme_io": false, 00:29:54.425 "nvme_io_md": false, 00:29:54.425 "write_zeroes": true, 00:29:54.425 "zcopy": true, 00:29:54.425 "get_zone_info": false, 00:29:54.425 "zone_management": false, 00:29:54.425 "zone_append": false, 00:29:54.425 "compare": false, 00:29:54.425 "compare_and_write": false, 00:29:54.425 "abort": true, 00:29:54.425 "seek_hole": false, 00:29:54.425 "seek_data": false, 00:29:54.425 "copy": true, 00:29:54.425 "nvme_iov_md": false 00:29:54.425 }, 00:29:54.425 "memory_domains": [ 00:29:54.425 { 00:29:54.425 "dma_device_id": "system", 00:29:54.425 "dma_device_type": 1 00:29:54.425 }, 00:29:54.425 { 
00:29:54.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:54.425 "dma_device_type": 2 00:29:54.425 } 00:29:54.425 ], 00:29:54.425 "driver_specific": {} 00:29:54.425 } 00:29:54.425 ] 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:29:54.425 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:54.425 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.711 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:54.711 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:29:54.711 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:29:54.712 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "7570e1c4-64fb-5848-b128-a6360cf67e45"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7570e1c4-64fb-5848-b128-a6360cf67e45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "25e122fc-1f28-5f12-8562-98a249a224f3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "25e122fc-1f28-5f12-8562-98a249a224f3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:29:54.712 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:29:54.712 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:29:54.712 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:29:54.712 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 1802305 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 1802305 ']' 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 1802305 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1802305 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1802305' 00:29:54.712 killing process with pid 1802305 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 1802305 00:29:54.712 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 1802305 00:29:54.971 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:54.971 08:05:39 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:54.971 
08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:54.971 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:54.971 08:05:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:54.971 ************************************ 00:29:54.971 START TEST bdev_hello_world 00:29:54.971 ************************************ 00:29:54.971 08:05:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:29:54.971 [2024-07-15 08:05:39.634953] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:29:54.971 [2024-07-15 08:05:39.634998] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802617 ] 00:29:54.971 [2024-07-15 08:05:39.722329] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.231 [2024-07-15 08:05:39.792287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.231 [2024-07-15 08:05:39.929636] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:55.231 [2024-07-15 08:05:39.929688] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:55.231 [2024-07-15 08:05:39.929696] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:55.231 [2024-07-15 08:05:39.937653] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:55.231 [2024-07-15 08:05:39.937665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:55.231 [2024-07-15 08:05:39.937671] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:55.231 [2024-07-15 08:05:39.945675] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:55.231 [2024-07-15 08:05:39.945685] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:55.231 [2024-07-15 08:05:39.945691] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:55.231 [2024-07-15 08:05:39.982442] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:29:55.231 [2024-07-15 08:05:39.982464] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:29:55.231 [2024-07-15 08:05:39.982474] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:29:55.231 [2024-07-15 08:05:39.983740] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:29:55.231 [2024-07-15 08:05:39.983794] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:29:55.231 [2024-07-15 08:05:39.983803] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:29:55.231 [2024-07-15 08:05:39.983826] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
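The hello_world sub-test above is simply the stock hello_bdev example pointed at the crypto vbdev; per the NOTICE lines, the write/read round-trip of "Hello World!" succeeded. Generalised from the command recorded in the trace:

  # Run the example app against one named bdev from a JSON bdev config.
  ./build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram
  # Flow per the log: open bdev -> get I/O channel -> write -> read back -> stop app.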
00:29:55.231 00:29:55.231 [2024-07-15 08:05:39.983836] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:29:55.490 00:29:55.490 real 0m0.538s 00:29:55.490 user 0m0.381s 00:29:55.490 sys 0m0.146s 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:29:55.490 ************************************ 00:29:55.490 END TEST bdev_hello_world 00:29:55.490 ************************************ 00:29:55.490 08:05:40 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:55.490 08:05:40 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:29:55.490 08:05:40 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:55.490 08:05:40 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:55.490 08:05:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:55.490 ************************************ 00:29:55.490 START TEST bdev_bounds 00:29:55.490 ************************************ 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1802652 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1802652' 00:29:55.490 Process bdevio pid: 1802652 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1802652 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1802652 ']' 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:55.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:55.490 08:05:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:55.490 [2024-07-15 08:05:40.235532] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:29:55.490 [2024-07-15 08:05:40.235575] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1802652 ] 00:29:55.750 [2024-07-15 08:05:40.323630] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:55.750 [2024-07-15 08:05:40.390323] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:55.750 [2024-07-15 08:05:40.390466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.750 [2024-07-15 08:05:40.390467] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:56.010 [2024-07-15 08:05:40.529888] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:56.010 [2024-07-15 08:05:40.529932] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:56.010 [2024-07-15 08:05:40.529941] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:56.010 [2024-07-15 08:05:40.537907] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:56.010 [2024-07-15 08:05:40.537917] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:56.010 [2024-07-15 08:05:40.537923] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:56.010 [2024-07-15 08:05:40.545929] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:56.010 [2024-07-15 08:05:40.545939] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:56.010 [2024-07-15 08:05:40.545945] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:56.582 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:56.582 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:29:56.582 08:05:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:56.582 I/O targets: 00:29:56.582 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:29:56.582 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:29:56.582 00:29:56.582 00:29:56.582 CUnit - A unit testing framework for C - Version 2.1-3 00:29:56.582 http://cunit.sourceforge.net/ 00:29:56.582 00:29:56.582 00:29:56.582 Suite: bdevio tests on: crypto_ram3 00:29:56.582 Test: blockdev write read block ...passed 00:29:56.582 Test: blockdev write zeroes read block ...passed 00:29:56.582 Test: blockdev write zeroes read no split ...passed 00:29:56.582 Test: blockdev write zeroes read split ...passed 00:29:56.582 Test: blockdev write zeroes read split partial ...passed 00:29:56.582 Test: blockdev reset ...passed 00:29:56.582 Test: blockdev write read 8 blocks ...passed 00:29:56.582 Test: blockdev write read size > 128k ...passed 00:29:56.582 Test: blockdev write read invalid size ...passed 00:29:56.582 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:56.582 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:56.582 Test: blockdev write read max offset ...passed 00:29:56.582 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:56.582 Test: blockdev writev readv 8 blocks 
...passed 00:29:56.582 Test: blockdev writev readv 30 x 1block ...passed 00:29:56.582 Test: blockdev writev readv block ...passed 00:29:56.582 Test: blockdev writev readv size > 128k ...passed 00:29:56.582 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:56.582 Test: blockdev comparev and writev ...passed 00:29:56.582 Test: blockdev nvme passthru rw ...passed 00:29:56.582 Test: blockdev nvme passthru vendor specific ...passed 00:29:56.582 Test: blockdev nvme admin passthru ...passed 00:29:56.582 Test: blockdev copy ...passed 00:29:56.582 Suite: bdevio tests on: crypto_ram 00:29:56.582 Test: blockdev write read block ...passed 00:29:56.582 Test: blockdev write zeroes read block ...passed 00:29:56.582 Test: blockdev write zeroes read no split ...passed 00:29:56.582 Test: blockdev write zeroes read split ...passed 00:29:56.582 Test: blockdev write zeroes read split partial ...passed 00:29:56.582 Test: blockdev reset ...passed 00:29:56.582 Test: blockdev write read 8 blocks ...passed 00:29:56.582 Test: blockdev write read size > 128k ...passed 00:29:56.582 Test: blockdev write read invalid size ...passed 00:29:56.582 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:56.582 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:56.582 Test: blockdev write read max offset ...passed 00:29:56.582 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:56.582 Test: blockdev writev readv 8 blocks ...passed 00:29:56.582 Test: blockdev writev readv 30 x 1block ...passed 00:29:56.582 Test: blockdev writev readv block ...passed 00:29:56.582 Test: blockdev writev readv size > 128k ...passed 00:29:56.582 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:56.582 Test: blockdev comparev and writev ...passed 00:29:56.582 Test: blockdev nvme passthru rw ...passed 00:29:56.582 Test: blockdev nvme passthru vendor specific ...passed 00:29:56.582 Test: blockdev nvme admin passthru ...passed 00:29:56.582 Test: blockdev copy ...passed 00:29:56.582 00:29:56.582 Run Summary: Type Total Ran Passed Failed Inactive 00:29:56.583 suites 2 2 n/a 0 0 00:29:56.583 tests 46 46 46 0 0 00:29:56.583 asserts 260 260 260 0 n/a 00:29:56.583 00:29:56.583 Elapsed time = 0.147 seconds 00:29:56.583 0 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1802652 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1802652 ']' 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1802652 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1802652 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1802652' 00:29:56.583 killing process with pid 1802652 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1802652 00:29:56.583 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 
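The bdev_bounds stage above launches the bdevio application with the same JSON config and then drives its CUnit suites over RPC. In outline (arguments copied from the trace; bdevio started with -w waits for the RPC trigger before running):

  # Start bdevio in wait mode, then kick off the test suites over RPC.
  ./test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json &
  ./test/bdev/bdevio/tests.py perform_tests
  # Result above: 46/46 tests passed across the crypto_ram and crypto_ram3 suites in ~0.15 s.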
-- # wait 1802652 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:29:56.843 00:29:56.843 real 0m1.268s 00:29:56.843 user 0m3.463s 00:29:56.843 sys 0m0.263s 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:29:56.843 ************************************ 00:29:56.843 END TEST bdev_bounds 00:29:56.843 ************************************ 00:29:56.843 08:05:41 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:29:56.843 08:05:41 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:56.843 08:05:41 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:56.843 08:05:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:56.843 08:05:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:29:56.843 ************************************ 00:29:56.843 START TEST bdev_nbd 00:29:56.843 ************************************ 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:29:56.843 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1802981 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 
1802981 /var/tmp/spdk-nbd.sock 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1802981 ']' 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:29:56.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:56.844 08:05:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:56.844 [2024-07-15 08:05:41.594261] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:29:56.844 [2024-07-15 08:05:41.594325] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:57.104 [2024-07-15 08:05:41.685789] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.104 [2024-07-15 08:05:41.749917] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.364 [2024-07-15 08:05:41.893426] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:29:57.364 [2024-07-15 08:05:41.893472] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:57.364 [2024-07-15 08:05:41.893481] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:57.364 [2024-07-15 08:05:41.901443] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:29:57.364 [2024-07-15 08:05:41.901458] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:57.364 [2024-07-15 08:05:41.901464] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:57.364 [2024-07-15 08:05:41.909463] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:29:57.364 [2024-07-15 08:05:41.909473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:29:57.364 [2024-07-15 08:05:41.909479] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:57.933 08:05:42 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:57.933 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:57.934 1+0 records in 00:29:57.934 1+0 records out 00:29:57.934 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212744 s, 19.3 MB/s 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:57.934 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:58.194 1+0 records in 00:29:58.194 1+0 records out 00:29:58.194 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310282 s, 13.2 MB/s 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:29:58.194 08:05:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:29:58.455 { 00:29:58.455 "nbd_device": "/dev/nbd0", 00:29:58.455 "bdev_name": "crypto_ram" 00:29:58.455 }, 00:29:58.455 { 00:29:58.455 "nbd_device": "/dev/nbd1", 00:29:58.455 "bdev_name": "crypto_ram3" 00:29:58.455 } 00:29:58.455 ]' 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:29:58.455 { 00:29:58.455 "nbd_device": "/dev/nbd0", 00:29:58.455 "bdev_name": "crypto_ram" 00:29:58.455 }, 00:29:58.455 { 00:29:58.455 "nbd_device": "/dev/nbd1", 00:29:58.455 "bdev_name": "crypto_ram3" 00:29:58.455 } 00:29:58.455 ]' 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
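The nbd_rpc_start_stop_verify phase above exports each bdev as a kernel NBD device through the dedicated /var/tmp/spdk-nbd.sock socket of the bdev_svc app and proves the device answers with a single O_DIRECT read. The pattern, condensed from the trace:

  # Export a bdev as /dev/nbd0 over the bdev_svc RPC socket.
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  # Wait for the kernel to register the device (the script retries this up to 20 times).
  grep -q -w nbd0 /proc/partitions
  # Read one block with O_DIRECT to bypass the page cache, then tear the export down.
  dd if=/dev/nbd0 of=nbdtest bs=4096 count=1 iflag=direct
  ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0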
/dev/nbd1' 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:58.455 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:58.714 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:58.714 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:58.714 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:58.714 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:58.715 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:58.715 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:58.715 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:58.715 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:58.715 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:58.715 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:58.974 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# grep -c /dev/nbd 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:59.234 /dev/nbd0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:59.234 1+0 records in 00:29:59.234 1+0 records out 00:29:59.234 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273236 s, 15.0 MB/s 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:59.234 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:59.494 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:59.494 08:05:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:59.494 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:59.494 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:59.494 08:05:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:29:59.494 /dev/nbd1 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:59.494 1+0 records in 00:29:59.494 1+0 records out 00:29:59.494 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284727 s, 14.4 MB/s 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:59.494 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:59.753 { 00:29:59.753 "nbd_device": "/dev/nbd0", 00:29:59.753 "bdev_name": "crypto_ram" 00:29:59.753 }, 00:29:59.753 { 00:29:59.753 "nbd_device": "/dev/nbd1", 00:29:59.753 "bdev_name": "crypto_ram3" 00:29:59.753 } 00:29:59.753 ]' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:59.753 { 00:29:59.753 "nbd_device": "/dev/nbd0", 00:29:59.753 "bdev_name": "crypto_ram" 00:29:59.753 }, 00:29:59.753 { 00:29:59.753 "nbd_device": "/dev/nbd1", 00:29:59.753 "bdev_name": "crypto_ram3" 00:29:59.753 } 00:29:59.753 ]' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:59.753 /dev/nbd1' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:59.753 /dev/nbd1' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:59.753 256+0 records in 00:29:59.753 256+0 records out 00:29:59.753 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0119298 s, 87.9 MB/s 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:59.753 256+0 records in 00:29:59.753 256+0 records out 00:29:59.753 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0180741 s, 58.0 MB/s 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:59.753 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:00.013 256+0 records in 00:30:00.013 256+0 records out 00:30:00.013 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0258919 s, 40.5 MB/s 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:00.013 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:00.013 08:05:44 
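nbd_dd_data_verify, traced above, is the actual data-integrity check: the same 1 MiB random pattern is written through both NBD exports and each device is then byte-compared against the source file. Condensed:

  # Write path: 256 x 4 KiB of random data pushed through each exported device.
  dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
  dd if=nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
  dd if=nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
  # Verify path: compare the first 1 MiB of each device with the pattern, then clean up.
  cmp -b -n 1M nbdrandtest /dev/nbd0
  cmp -b -n 1M nbdrandtest /dev/nbd1
  rm nbdrandtest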
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:00.273 08:05:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:00.533 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:00.533 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:00.533 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:00.533 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:30:00.534 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:30:00.808 malloc_lvol_verify 00:30:00.808 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:30:01.145 
350c7d09-ebf5-4c80-b2ef-15585d53689b 00:30:01.145 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:30:01.145 b844924f-8aa5-446f-8bed-0dd99def252e 00:30:01.146 08:05:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:30:01.406 /dev/nbd0 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:30:01.406 mke2fs 1.46.5 (30-Dec-2021) 00:30:01.406 Discarding device blocks: 0/4096 done 00:30:01.406 Creating filesystem with 4096 1k blocks and 1024 inodes 00:30:01.406 00:30:01.406 Allocating group tables: 0/1 done 00:30:01.406 Writing inode tables: 0/1 done 00:30:01.406 Creating journal (1024 blocks): done 00:30:01.406 Writing superblocks and filesystem accounting information: 0/1 done 00:30:01.406 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:01.406 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1802981 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1802981 ']' 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1802981 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1802981 
00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1802981' 00:30:01.668 killing process with pid 1802981 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1802981 00:30:01.668 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1802981 00:30:01.929 08:05:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:30:01.929 00:30:01.929 real 0m5.108s 00:30:01.929 user 0m7.622s 00:30:01.929 sys 0m1.493s 00:30:01.929 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:01.929 08:05:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:01.929 ************************************ 00:30:01.929 END TEST bdev_nbd 00:30:01.929 ************************************ 00:30:01.929 08:05:46 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:01.929 08:05:46 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:30:01.929 08:05:46 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:30:01.929 08:05:46 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:30:01.929 08:05:46 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:30:01.929 08:05:46 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:01.929 08:05:46 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:01.929 08:05:46 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:02.190 ************************************ 00:30:02.190 START TEST bdev_fio 00:30:02.190 ************************************ 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:02.190 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:02.190 ************************************ 00:30:02.190 START TEST bdev_fio_rw_verify 00:30:02.190 ************************************ 00:30:02.190 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:02.191 08:05:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:02.451 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:02.451 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:02.451 fio-3.35 00:30:02.451 Starting 2 threads 00:30:14.684 00:30:14.684 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1804189: Mon Jul 15 08:05:57 2024 00:30:14.684 read: IOPS=30.0k, BW=117MiB/s (123MB/s)(1173MiB/10000msec) 00:30:14.684 slat (usec): min=10, max=216, avg=13.87, stdev= 3.53 00:30:14.684 clat (usec): min=5, max=643, avg=104.82, stdev=41.37 00:30:14.684 lat (usec): min=17, max=657, avg=118.68, stdev=42.31 00:30:14.684 clat percentiles (usec): 00:30:14.684 | 50.000th=[ 103], 99.000th=[ 200], 99.900th=[ 229], 99.990th=[ 258], 00:30:14.684 | 99.999th=[ 603] 00:30:14.684 write: IOPS=36.1k, BW=141MiB/s (148MB/s)(1336MiB/9475msec); 0 zone resets 00:30:14.684 slat (usec): min=10, max=1773, avg=24.58, stdev= 5.18 00:30:14.684 clat (usec): min=6, max=2041, avg=143.54, stdev=66.01 00:30:14.684 lat (usec): min=27, max=2069, avg=168.13, stdev=67.37 00:30:14.684 clat percentiles (usec): 00:30:14.684 | 50.000th=[ 141], 99.000th=[ 289], 99.900th=[ 326], 99.990th=[ 586], 00:30:14.684 | 99.999th=[ 1958] 00:30:14.684 bw ( KiB/s): min=127192, max=143896, per=94.66%, avg=136696.00, stdev=2608.07, samples=38 00:30:14.684 iops : min=31798, max=35974, avg=34174.00, stdev=652.02, samples=38 00:30:14.684 lat (usec) : 10=0.01%, 20=0.01%, 50=8.26%, 100=29.56%, 250=58.32% 00:30:14.684 lat (usec) : 500=3.84%, 750=0.01%, 1000=0.01% 00:30:14.684 lat (msec) : 2=0.01%, 4=0.01% 00:30:14.684 cpu : usr=99.71%, sys=0.00%, ctx=32, majf=0, minf=534 00:30:14.684 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:14.684 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:14.684 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:14.684 issued rwts: total=300401,342056,0,0 short=0,0,0,0 dropped=0,0,0,0 00:30:14.684 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:14.684 00:30:14.684 Run status group 0 (all jobs): 00:30:14.684 READ: bw=117MiB/s (123MB/s), 117MiB/s-117MiB/s (123MB/s-123MB/s), io=1173MiB (1230MB), run=10000-10000msec 00:30:14.684 WRITE: bw=141MiB/s (148MB/s), 141MiB/s-141MiB/s (148MB/s-148MB/s), io=1336MiB (1401MB), run=9475-9475msec 00:30:14.684 00:30:14.684 real 0m10.995s 00:30:14.684 user 0m27.124s 00:30:14.684 sys 0m0.312s 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:30:14.684 ************************************ 00:30:14.684 END TEST bdev_fio_rw_verify 00:30:14.684 ************************************ 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:30:14.684 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "7570e1c4-64fb-5848-b128-a6360cf67e45"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7570e1c4-64fb-5848-b128-a6360cf67e45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "25e122fc-1f28-5f12-8562-98a249a224f3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "25e122fc-1f28-5f12-8562-98a249a224f3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": 
false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:30:14.685 crypto_ram3 ]] 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "7570e1c4-64fb-5848-b128-a6360cf67e45"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7570e1c4-64fb-5848-b128-a6360cf67e45",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "25e122fc-1f28-5f12-8562-98a249a224f3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "25e122fc-1f28-5f12-8562-98a249a224f3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' 
},' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:14.685 ************************************ 00:30:14.685 START TEST bdev_fio_trim 00:30:14.685 ************************************ 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:14.685 08:05:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:14.685 08:05:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:14.685 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:14.685 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:14.685 fio-3.35 00:30:14.685 Starting 2 threads 00:30:24.699 00:30:24.699 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1806178: Mon Jul 15 08:06:08 2024 00:30:24.699 write: IOPS=56.8k, BW=222MiB/s (233MB/s)(2218MiB/10001msec); 0 zone resets 00:30:24.699 slat (usec): min=10, max=422, avg=14.91, stdev= 4.46 00:30:24.699 clat (usec): min=31, max=2007, avg=117.12, stdev=65.28 00:30:24.700 lat (usec): min=42, max=2030, avg=132.02, stdev=67.68 00:30:24.700 clat percentiles (usec): 00:30:24.700 | 50.000th=[ 93], 99.000th=[ 249], 99.900th=[ 281], 99.990th=[ 570], 00:30:24.700 | 99.999th=[ 1926] 00:30:24.700 bw ( KiB/s): min=211424, max=232144, per=99.97%, avg=227055.58, stdev=3102.55, samples=38 00:30:24.700 iops : min=52856, max=58036, avg=56763.89, stdev=775.64, samples=38 00:30:24.700 trim: IOPS=56.8k, BW=222MiB/s (233MB/s)(2218MiB/10001msec); 0 zone 
resets 00:30:24.700 slat (usec): min=4, max=115, avg= 7.03, stdev= 2.42 00:30:24.700 clat (usec): min=35, max=1850, avg=78.20, stdev=24.07 00:30:24.700 lat (usec): min=42, max=1858, avg=85.23, stdev=24.18 00:30:24.700 clat percentiles (usec): 00:30:24.700 | 50.000th=[ 79], 99.000th=[ 133], 99.900th=[ 155], 99.990th=[ 289], 00:30:24.700 | 99.999th=[ 529] 00:30:24.700 bw ( KiB/s): min=211424, max=232152, per=99.97%, avg=227056.84, stdev=3101.58, samples=38 00:30:24.700 iops : min=52856, max=58038, avg=56764.21, stdev=775.39, samples=38 00:30:24.700 lat (usec) : 50=15.00%, 100=52.24%, 250=32.32%, 500=0.44%, 750=0.01% 00:30:24.700 lat (msec) : 2=0.01%, 4=0.01% 00:30:24.700 cpu : usr=99.69%, sys=0.00%, ctx=28, majf=0, minf=207 00:30:24.700 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:24.700 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:24.700 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:24.700 issued rwts: total=0,567855,567856,0 short=0,0,0,0 dropped=0,0,0,0 00:30:24.700 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:24.700 00:30:24.700 Run status group 0 (all jobs): 00:30:24.700 WRITE: bw=222MiB/s (233MB/s), 222MiB/s-222MiB/s (233MB/s-233MB/s), io=2218MiB (2326MB), run=10001-10001msec 00:30:24.700 TRIM: bw=222MiB/s (233MB/s), 222MiB/s-222MiB/s (233MB/s-233MB/s), io=2218MiB (2326MB), run=10001-10001msec 00:30:24.700 00:30:24.700 real 0m11.058s 00:30:24.700 user 0m26.657s 00:30:24.700 sys 0m0.351s 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:30:24.700 ************************************ 00:30:24.700 END TEST bdev_fio_trim 00:30:24.700 ************************************ 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:30:24.700 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:30:24.700 00:30:24.700 real 0m22.388s 00:30:24.700 user 0m53.975s 00:30:24.700 sys 0m0.820s 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:24.700 ************************************ 00:30:24.700 END TEST bdev_fio 00:30:24.700 ************************************ 00:30:24.700 08:06:09 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:24.700 08:06:09 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:24.700 08:06:09 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:24.700 08:06:09 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:30:24.700 08:06:09 blockdev_crypto_sw -- common/autotest_common.sh@1105 
-- # xtrace_disable 00:30:24.700 08:06:09 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:24.700 ************************************ 00:30:24.700 START TEST bdev_verify 00:30:24.700 ************************************ 00:30:24.700 08:06:09 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:24.700 [2024-07-15 08:06:09.238346] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:30:24.700 [2024-07-15 08:06:09.238405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1807867 ] 00:30:24.700 [2024-07-15 08:06:09.316640] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:24.700 [2024-07-15 08:06:09.411806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:24.700 [2024-07-15 08:06:09.411872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:24.960 [2024-07-15 08:06:09.589550] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:24.961 [2024-07-15 08:06:09.589620] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:24.961 [2024-07-15 08:06:09.589630] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:24.961 [2024-07-15 08:06:09.597566] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:24.961 [2024-07-15 08:06:09.597580] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:24.961 [2024-07-15 08:06:09.597586] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:24.961 [2024-07-15 08:06:09.605585] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:24.961 [2024-07-15 08:06:09.605598] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:24.961 [2024-07-15 08:06:09.605604] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:24.961 Running I/O for 5 seconds... 
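For reference, the bdev_verify run traced above comes down to a single bdevperf call in verify mode against the crypto bdevs described in bdev.json. The lines below are a condensed sketch, not captured output: SPDK_ROOT stands in for the workspace path shown in the trace, and the core flags are copied from it (the harness's trailing empty argument is dropped here).

SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Verify workload over the crypto bdevs: queue depth 128, 4 KiB I/Os, 5 second run,
# core mask 0x3 (two reactors); -C is passed through unchanged as in the trace.
"$SPDK_ROOT/build/examples/bdevperf" \
    --json "$SPDK_ROOT/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3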
00:30:30.245 00:30:30.245 Latency(us) 00:30:30.245 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:30.245 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:30.245 Verification LBA range: start 0x0 length 0x800 00:30:30.245 crypto_ram : 5.01 7817.57 30.54 0.00 0.00 16309.64 1247.70 20265.75 00:30:30.245 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:30.245 Verification LBA range: start 0x800 length 0x800 00:30:30.245 crypto_ram : 5.01 6411.45 25.04 0.00 0.00 19881.61 1468.26 22483.89 00:30:30.245 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:30.245 Verification LBA range: start 0x0 length 0x800 00:30:30.245 crypto_ram3 : 5.02 3925.67 15.33 0.00 0.00 32451.31 1455.66 22988.01 00:30:30.245 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:30.245 Verification LBA range: start 0x800 length 0x800 00:30:30.245 crypto_ram3 : 5.02 3214.28 12.56 0.00 0.00 39611.33 1714.02 27424.30 00:30:30.245 =================================================================================================================== 00:30:30.245 Total : 21368.97 83.47 0.00 0.00 23858.50 1247.70 27424.30 00:30:30.245 00:30:30.245 real 0m5.638s 00:30:30.245 user 0m10.722s 00:30:30.245 sys 0m0.189s 00:30:30.245 08:06:14 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:30.245 08:06:14 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:30:30.245 ************************************ 00:30:30.245 END TEST bdev_verify 00:30:30.245 ************************************ 00:30:30.245 08:06:14 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:30.245 08:06:14 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:30.245 08:06:14 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:30:30.245 08:06:14 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:30.245 08:06:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:30.245 ************************************ 00:30:30.245 START TEST bdev_verify_big_io 00:30:30.245 ************************************ 00:30:30.245 08:06:14 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:30.245 [2024-07-15 08:06:14.961298] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:30:30.245 [2024-07-15 08:06:14.961359] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1809248 ] 00:30:30.505 [2024-07-15 08:06:15.030936] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:30.505 [2024-07-15 08:06:15.097733] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:30.505 [2024-07-15 08:06:15.097822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:30.505 [2024-07-15 08:06:15.240397] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:30.505 [2024-07-15 08:06:15.240441] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:30.505 [2024-07-15 08:06:15.240449] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:30.505 [2024-07-15 08:06:15.248414] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:30.505 [2024-07-15 08:06:15.248426] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:30.505 [2024-07-15 08:06:15.248432] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:30.505 [2024-07-15 08:06:15.256434] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:30.505 [2024-07-15 08:06:15.256444] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:30.505 [2024-07-15 08:06:15.256456] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:30.765 Running I/O for 5 seconds... 
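The bdev_verify_big_io pass starting here reuses the same bdevperf verify setup; the only parameter that changes in the trace is the I/O size (65536 bytes instead of 4096), which is why the results that follow show lower IOPS but comparable bandwidth. A condensed sketch under the same assumptions as the previous one:

SPDK_ROOT=/var/jenkins/workspace/crypto-phy-autotest/spdk
# Identical verify invocation, but with 64 KiB I/Os (-o 65536) as traced above.
"$SPDK_ROOT/build/examples/bdevperf" \
    --json "$SPDK_ROOT/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3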
00:30:36.044 00:30:36.044 Latency(us) 00:30:36.044 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.044 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:30:36.044 Verification LBA range: start 0x0 length 0x80 00:30:36.044 crypto_ram : 5.07 504.55 31.53 0.00 0.00 247607.07 3175.98 367808.20 00:30:36.044 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:30:36.044 Verification LBA range: start 0x80 length 0x80 00:30:36.044 crypto_ram : 5.24 415.39 25.96 0.00 0.00 300182.75 4688.34 406524.85 00:30:36.044 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:30:36.044 Verification LBA range: start 0x0 length 0x80 00:30:36.044 crypto_ram3 : 5.24 268.46 16.78 0.00 0.00 450941.10 3554.07 377487.36 00:30:36.044 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:30:36.044 Verification LBA range: start 0x80 length 0x80 00:30:36.044 crypto_ram3 : 5.25 219.48 13.72 0.00 0.00 546164.98 3755.72 419430.40 00:30:36.044 =================================================================================================================== 00:30:36.044 Total : 1407.89 87.99 0.00 0.00 349668.16 3175.98 419430.40 00:30:36.044 00:30:36.044 real 0m5.799s 00:30:36.044 user 0m11.151s 00:30:36.044 sys 0m0.136s 00:30:36.044 08:06:20 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:36.044 08:06:20 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:30:36.044 ************************************ 00:30:36.044 END TEST bdev_verify_big_io 00:30:36.044 ************************************ 00:30:36.044 08:06:20 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:36.044 08:06:20 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:36.044 08:06:20 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:30:36.044 08:06:20 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:36.044 08:06:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:36.044 ************************************ 00:30:36.044 START TEST bdev_write_zeroes 00:30:36.044 ************************************ 00:30:36.044 08:06:20 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:36.305 [2024-07-15 08:06:20.828062] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:30:36.305 [2024-07-15 08:06:20.828121] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1810175 ] 00:30:36.305 [2024-07-15 08:06:20.920399] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.305 [2024-07-15 08:06:20.995814] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.565 [2024-07-15 08:06:21.135995] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:36.565 [2024-07-15 08:06:21.136047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:36.565 [2024-07-15 08:06:21.136056] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.565 [2024-07-15 08:06:21.144012] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:30:36.565 [2024-07-15 08:06:21.144024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:36.565 [2024-07-15 08:06:21.144042] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.565 [2024-07-15 08:06:21.152032] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:30:36.565 [2024-07-15 08:06:21.152043] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:30:36.565 [2024-07-15 08:06:21.152048] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.565 Running I/O for 1 seconds... 00:30:37.534 00:30:37.534 Latency(us) 00:30:37.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:37.534 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:30:37.534 crypto_ram : 1.01 33186.74 129.64 0.00 0.00 3847.79 1714.02 5394.12 00:30:37.534 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:30:37.534 crypto_ram3 : 1.01 16621.25 64.93 0.00 0.00 7652.15 2823.09 7914.73 00:30:37.534 =================================================================================================================== 00:30:37.534 Total : 49807.99 194.56 0.00 0.00 5119.15 1714.02 7914.73 00:30:37.795 00:30:37.795 real 0m1.556s 00:30:37.795 user 0m1.384s 00:30:37.795 sys 0m0.156s 00:30:37.795 08:06:22 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:37.795 08:06:22 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:30:37.795 ************************************ 00:30:37.795 END TEST bdev_write_zeroes 00:30:37.795 ************************************ 00:30:37.795 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:37.795 08:06:22 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:37.795 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:30:37.795 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:37.795 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:37.795 
************************************ 00:30:37.795 START TEST bdev_json_nonenclosed 00:30:37.795 ************************************ 00:30:37.795 08:06:22 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:37.795 [2024-07-15 08:06:22.460693] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:30:37.795 [2024-07-15 08:06:22.460753] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1810474 ] 00:30:38.056 [2024-07-15 08:06:22.552796] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:38.056 [2024-07-15 08:06:22.627144] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:38.056 [2024-07-15 08:06:22.627206] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:30:38.056 [2024-07-15 08:06:22.627217] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:38.056 [2024-07-15 08:06:22.627225] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:38.056 00:30:38.056 real 0m0.281s 00:30:38.056 user 0m0.172s 00:30:38.056 sys 0m0.107s 00:30:38.056 08:06:22 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:30:38.056 08:06:22 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:38.056 08:06:22 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:30:38.056 ************************************ 00:30:38.056 END TEST bdev_json_nonenclosed 00:30:38.056 ************************************ 00:30:38.056 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:30:38.056 08:06:22 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:30:38.056 08:06:22 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:38.056 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:30:38.056 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:38.056 08:06:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:38.056 ************************************ 00:30:38.056 START TEST bdev_json_nonarray 00:30:38.056 ************************************ 00:30:38.056 08:06:22 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:30:38.318 [2024-07-15 08:06:22.816992] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:30:38.318 [2024-07-15 08:06:22.817036] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1810539 ] 00:30:38.318 [2024-07-15 08:06:22.903775] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:38.318 [2024-07-15 08:06:22.972569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:38.318 [2024-07-15 08:06:22.972631] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:30:38.318 [2024-07-15 08:06:22.972642] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:30:38.318 [2024-07-15 08:06:22.972649] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:30:38.318 00:30:38.318 real 0m0.268s 00:30:38.318 user 0m0.169s 00:30:38.318 sys 0m0.097s 00:30:38.318 08:06:23 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:30:38.318 08:06:23 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:38.318 08:06:23 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:30:38.318 ************************************ 00:30:38.318 END TEST bdev_json_nonarray 00:30:38.318 ************************************ 00:30:38.318 08:06:23 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:30:38.318 08:06:23 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:30:38.318 08:06:23 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:30:38.318 08:06:23 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:30:38.318 08:06:23 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:30:38.318 08:06:23 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:30:38.318 08:06:23 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:38.318 08:06:23 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:38.318 08:06:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:38.580 ************************************ 00:30:38.580 START TEST bdev_crypto_enomem 00:30:38.580 ************************************ 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=1810721 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 1810721 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w 
randwrite -t 5 -f '' 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 1810721 ']' 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:38.580 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:38.580 08:06:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:38.580 [2024-07-15 08:06:23.163345] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:30:38.580 [2024-07-15 08:06:23.163395] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1810721 ] 00:30:38.580 [2024-07-15 08:06:23.246813] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:38.840 [2024-07-15 08:06:23.344969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:39.411 true 00:30:39.411 base0 00:30:39.411 true 00:30:39.411 [2024-07-15 08:06:24.067579] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:30:39.411 crypt0 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:39.411 08:06:24 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:39.411 [ 00:30:39.411 { 00:30:39.411 "name": "crypt0", 00:30:39.411 "aliases": [ 00:30:39.411 "4fb9511f-c4ef-52e2-8db9-30313ae9f94e" 00:30:39.411 ], 00:30:39.411 "product_name": "crypto", 00:30:39.411 "block_size": 512, 00:30:39.411 "num_blocks": 2097152, 00:30:39.411 "uuid": "4fb9511f-c4ef-52e2-8db9-30313ae9f94e", 00:30:39.411 "assigned_rate_limits": { 00:30:39.411 "rw_ios_per_sec": 0, 00:30:39.411 "rw_mbytes_per_sec": 0, 00:30:39.411 "r_mbytes_per_sec": 0, 00:30:39.411 "w_mbytes_per_sec": 0 00:30:39.411 }, 00:30:39.411 "claimed": false, 00:30:39.411 "zoned": false, 00:30:39.411 "supported_io_types": { 00:30:39.411 "read": true, 00:30:39.411 "write": true, 00:30:39.411 "unmap": false, 00:30:39.411 "flush": false, 00:30:39.411 "reset": true, 00:30:39.411 "nvme_admin": false, 00:30:39.411 "nvme_io": false, 00:30:39.411 "nvme_io_md": false, 00:30:39.411 "write_zeroes": true, 00:30:39.411 "zcopy": false, 00:30:39.411 "get_zone_info": false, 00:30:39.411 "zone_management": false, 00:30:39.411 "zone_append": false, 00:30:39.411 "compare": false, 00:30:39.411 "compare_and_write": false, 00:30:39.411 "abort": false, 00:30:39.411 "seek_hole": false, 00:30:39.411 "seek_data": false, 00:30:39.411 "copy": false, 00:30:39.411 "nvme_iov_md": false 00:30:39.411 }, 00:30:39.411 "memory_domains": [ 00:30:39.411 { 00:30:39.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:39.411 "dma_device_type": 2 00:30:39.411 } 00:30:39.411 ], 00:30:39.411 "driver_specific": { 00:30:39.411 "crypto": { 00:30:39.411 "base_bdev_name": "EE_base0", 00:30:39.411 "name": "crypt0", 00:30:39.411 "key_name": "test_dek_sw" 00:30:39.411 } 00:30:39.411 } 00:30:39.411 } 00:30:39.411 ] 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=1810829 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:30:39.411 08:06:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:39.671 Running I/O for 5 seconds... 
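The enomem test above stacks a software-crypto vbdev (crypt0, key "test_dek_sw") on an error-injection bdev (EE_base0) and drives it with bdevperf. A minimal sketch of the same flow, with paths shortened to the SPDK repo root; the bdevperf, error-injection, perform_tests and delete commands are copied from the trace, while the malloc sizing and the bdev_crypto_create key-name flag are assumptions (sizing chosen to match the 2097152 x 512 B blocks reported for crypt0 above):

# start bdevperf idle (-z) on core 1, queue depth 32, 4 KiB random writes for 5 seconds
build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' &

# build the stack: malloc base -> error-injection bdev (EE_base0) -> crypto vbdev (crypt0)
scripts/rpc.py bdev_malloc_create -b base0 1024 512                  # 1024 MiB, 512 B blocks (assumed)
scripts/rpc.py bdev_error_create base0                               # exposes the injectable EE_base0
scripts/rpc.py bdev_crypto_create EE_base0 crypt0 -n test_dek_sw     # key-name option spelling assumed

# arm ENOMEM injection on writes (arguments copied verbatim from the trace), then run the workload
scripts/rpc.py bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem
examples/bdev/bdevperf/bdevperf.py perform_tests

# cleanup
scripts/rpc.py bdev_crypto_delete crypt0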
00:30:40.611 08:06:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:30:40.611 08:06:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:40.611 08:06:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:40.611 08:06:25 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:40.611 08:06:25 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 1810829 00:30:44.812 00:30:44.812 Latency(us) 00:30:44.812 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:44.812 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:30:44.812 crypt0 : 5.00 36110.11 141.06 0.00 0.00 882.13 437.96 1184.69 00:30:44.812 =================================================================================================================== 00:30:44.812 Total : 36110.11 141.06 0.00 0.00 882.13 437.96 1184.69 00:30:44.812 0 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 1810721 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 1810721 ']' 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 1810721 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1810721 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1810721' 00:30:44.812 killing process with pid 1810721 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 1810721 00:30:44.812 Received shutdown signal, test time was about 5.000000 seconds 00:30:44.812 00:30:44.812 Latency(us) 00:30:44.812 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:44.812 =================================================================================================================== 00:30:44.812 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 1810721 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:30:44.812 00:30:44.812 real 0m6.354s 00:30:44.812 user 0m6.605s 00:30:44.812 sys 0m0.309s 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:30:44.812 08:06:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:30:44.812 ************************************ 00:30:44.812 END TEST bdev_crypto_enomem 00:30:44.812 ************************************ 00:30:44.812 08:06:29 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:30:44.812 08:06:29 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:30:44.812 00:30:44.812 real 0m51.666s 00:30:44.812 user 1m37.777s 00:30:44.812 sys 0m4.698s 00:30:44.812 08:06:29 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:44.812 08:06:29 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:30:44.812 ************************************ 00:30:44.812 END TEST blockdev_crypto_sw 00:30:44.812 ************************************ 00:30:44.812 08:06:29 -- common/autotest_common.sh@1142 -- # return 0 00:30:44.812 08:06:29 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:30:44.812 08:06:29 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:44.812 08:06:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:44.812 08:06:29 -- common/autotest_common.sh@10 -- # set +x 00:30:45.073 ************************************ 00:30:45.073 START TEST blockdev_crypto_qat 00:30:45.073 ************************************ 00:30:45.073 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:30:45.073 * Looking for test storage... 
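The crypto_sw suite that just finished and the crypto_qat suite starting here are the same blockdev.sh driver invoked with a different test-type argument; run by hand from the SPDK repo root it would look roughly like this (paths shortened):

test/bdev/blockdev.sh crypto_sw    # software-crypto vbdevs (completed above)
test/bdev/blockdev.sh crypto_qat   # QAT-backed crypto vbdevs (starting below)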
00:30:45.073 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:45.073 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:45.073 08:06:29 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1811794 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1811794 00:30:45.074 08:06:29 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:45.074 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 1811794 ']' 00:30:45.074 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:45.074 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:45.074 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
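Because crypto_qat needs the DPDK cryptodev accel module configured before the subsystems initialize, spdk_tgt is started with --wait-for-rpc and then configured over RPC, which is what produces the "Using driver crypto_qat" and "assigned to module dpdk_cryptodev" notices below. A rough sketch of that configuration step; the exact option spellings and the key parameters are assumptions from the SPDK RPC interface and may differ between versions:

build/bin/spdk_tgt --wait-for-rpc &

scripts/rpc.py dpdk_cryptodev_set_driver -d crypto_qat         # driver selection ("Using driver crypto_qat")
scripts/rpc.py dpdk_cryptodev_scan_accel_module
scripts/rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev   # route encrypt/decrypt to dpdk_cryptodev
scripts/rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
scripts/rpc.py framework_start_init                            # finish init ("Found crypto devices: 96")

# register a DEK and stack a crypto vbdev on a 32 MiB malloc base (key material omitted)
scripts/rpc.py accel_crypto_key_create -c AES_CBC -k <hex_key> -n test_dek_qat_cbc
scripts/rpc.py bdev_malloc_create -b Malloc0 32 512
scripts/rpc.py bdev_crypto_create Malloc0 crypto_ram -n test_dek_qat_cbc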
00:30:45.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:45.074 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:45.074 08:06:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:45.074 [2024-07-15 08:06:29.769694] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:30:45.074 [2024-07-15 08:06:29.769766] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1811794 ] 00:30:45.334 [2024-07-15 08:06:29.862561] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:45.334 [2024-07-15 08:06:29.956463] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:45.905 08:06:30 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:45.905 08:06:30 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:30:45.905 08:06:30 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:30:45.905 08:06:30 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:30:45.905 08:06:30 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:30:45.905 08:06:30 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:45.905 08:06:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:45.905 [2024-07-15 08:06:30.634524] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:45.905 [2024-07-15 08:06:30.642553] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:45.905 [2024-07-15 08:06:30.650566] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:46.165 [2024-07-15 08:06:30.721945] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:48.707 true 00:30:48.707 true 00:30:48.707 true 00:30:48.707 true 00:30:48.707 Malloc0 00:30:48.707 Malloc1 00:30:48.707 Malloc2 00:30:48.707 Malloc3 00:30:48.708 [2024-07-15 08:06:33.148356] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:48.708 crypto_ram 00:30:48.708 [2024-07-15 08:06:33.156390] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:48.708 crypto_ram1 00:30:48.708 [2024-07-15 08:06:33.164394] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:48.708 crypto_ram2 00:30:48.708 [2024-07-15 08:06:33.172413] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:48.708 crypto_ram3 00:30:48.708 [ 00:30:48.708 { 00:30:48.708 "name": "Malloc1", 00:30:48.708 "aliases": [ 00:30:48.708 "771ba432-08e6-4dd5-bb96-c0512a8244f8" 00:30:48.708 ], 00:30:48.708 "product_name": "Malloc disk", 00:30:48.708 "block_size": 512, 00:30:48.708 "num_blocks": 65536, 00:30:48.708 "uuid": "771ba432-08e6-4dd5-bb96-c0512a8244f8", 00:30:48.708 "assigned_rate_limits": { 00:30:48.708 "rw_ios_per_sec": 0, 00:30:48.708 "rw_mbytes_per_sec": 0, 00:30:48.708 "r_mbytes_per_sec": 0, 00:30:48.708 "w_mbytes_per_sec": 0 00:30:48.708 }, 00:30:48.708 "claimed": true, 00:30:48.708 "claim_type": "exclusive_write", 00:30:48.708 "zoned": false, 00:30:48.708 "supported_io_types": { 
00:30:48.708 "read": true, 00:30:48.708 "write": true, 00:30:48.708 "unmap": true, 00:30:48.708 "flush": true, 00:30:48.708 "reset": true, 00:30:48.708 "nvme_admin": false, 00:30:48.708 "nvme_io": false, 00:30:48.708 "nvme_io_md": false, 00:30:48.708 "write_zeroes": true, 00:30:48.708 "zcopy": true, 00:30:48.708 "get_zone_info": false, 00:30:48.708 "zone_management": false, 00:30:48.708 "zone_append": false, 00:30:48.708 "compare": false, 00:30:48.708 "compare_and_write": false, 00:30:48.708 "abort": true, 00:30:48.708 "seek_hole": false, 00:30:48.708 "seek_data": false, 00:30:48.708 "copy": true, 00:30:48.708 "nvme_iov_md": false 00:30:48.708 }, 00:30:48.708 "memory_domains": [ 00:30:48.708 { 00:30:48.708 "dma_device_id": "system", 00:30:48.708 "dma_device_type": 1 00:30:48.708 }, 00:30:48.708 { 00:30:48.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:48.708 "dma_device_type": 2 00:30:48.708 } 00:30:48.708 ], 00:30:48.708 "driver_specific": {} 00:30:48.708 } 00:30:48.708 ] 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4221b852-00ac-55c6-bdf3-ddd70841495a"' ' ],' ' "product_name": "crypto",' 
' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4221b852-00ac-55c6-bdf3-ddd70841495a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d8b9dd12-a0f8-5ddc-b772-617ad2d57c58"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d8b9dd12-a0f8-5ddc-b772-617ad2d57c58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "97427d56-fcb2-5124-ac82-246d955dab31"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "97427d56-fcb2-5124-ac82-246d955dab31",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' 
"driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "23302043-cdc2-5a25-9f6d-d52d6e7e4273"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "23302043-cdc2-5a25-9f6d-d52d6e7e4273",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:30:48.708 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 1811794 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 1811794 ']' 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 1811794 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1811794 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1811794' 00:30:48.708 killing process with pid 1811794 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 1811794 00:30:48.708 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 1811794 00:30:49.277 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:49.277 08:06:33 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:49.277 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:49.277 08:06:33 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:49.277 08:06:33 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:30:49.277 ************************************ 00:30:49.277 START TEST bdev_hello_world 00:30:49.277 ************************************ 00:30:49.277 08:06:33 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:49.277 [2024-07-15 08:06:33.877194] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:30:49.277 [2024-07-15 08:06:33.877246] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1812419 ] 00:30:49.277 [2024-07-15 08:06:33.968277] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:49.537 [2024-07-15 08:06:34.062653] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.537 [2024-07-15 08:06:34.083777] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:49.537 [2024-07-15 08:06:34.091800] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:49.537 [2024-07-15 08:06:34.099813] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:49.537 [2024-07-15 08:06:34.203817] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:52.083 [2024-07-15 08:06:36.470872] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:52.083 [2024-07-15 08:06:36.470954] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:52.083 [2024-07-15 08:06:36.470965] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.083 [2024-07-15 08:06:36.478888] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:52.083 [2024-07-15 08:06:36.478902] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:52.083 [2024-07-15 08:06:36.478909] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.083 [2024-07-15 08:06:36.486907] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:52.083 [2024-07-15 08:06:36.486920] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:52.083 [2024-07-15 08:06:36.486926] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.083 [2024-07-15 08:06:36.494927] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:52.083 [2024-07-15 08:06:36.494939] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:52.083 [2024-07-15 08:06:36.494945] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:52.083 [2024-07-15 08:06:36.570422] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:52.083 [2024-07-15 08:06:36.570470] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:52.083 [2024-07-15 08:06:36.570482] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:30:52.083 [2024-07-15 08:06:36.571703] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:52.084 [2024-07-15 08:06:36.571796] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:52.084 [2024-07-15 08:06:36.571807] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:52.084 [2024-07-15 08:06:36.571845] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:30:52.084 00:30:52.084 [2024-07-15 08:06:36.571858] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:52.345 00:30:52.345 real 0m3.095s 00:30:52.345 user 0m2.683s 00:30:52.345 sys 0m0.364s 00:30:52.345 08:06:36 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:52.346 ************************************ 00:30:52.346 END TEST bdev_hello_world 00:30:52.346 ************************************ 00:30:52.346 08:06:36 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:30:52.346 08:06:36 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:30:52.346 08:06:36 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:52.346 08:06:36 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:52.346 08:06:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:52.346 ************************************ 00:30:52.346 START TEST bdev_bounds 00:30:52.346 ************************************ 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1813036 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1813036' 00:30:52.346 Process bdevio pid: 1813036 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1813036 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1813036 ']' 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:52.346 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:52.346 08:06:36 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:52.346 [2024-07-15 08:06:37.044051] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
00:30:52.346 [2024-07-15 08:06:37.044109] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1813036 ] 00:30:52.607 [2024-07-15 08:06:37.140084] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:52.607 [2024-07-15 08:06:37.235725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:52.607 [2024-07-15 08:06:37.235769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:52.607 [2024-07-15 08:06:37.235841] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:52.607 [2024-07-15 08:06:37.257147] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:52.607 [2024-07-15 08:06:37.265173] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:52.607 [2024-07-15 08:06:37.273193] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:52.868 [2024-07-15 08:06:37.372908] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:55.415 [2024-07-15 08:06:39.602022] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:55.415 [2024-07-15 08:06:39.602082] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:55.415 [2024-07-15 08:06:39.602091] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.415 [2024-07-15 08:06:39.610037] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:55.415 [2024-07-15 08:06:39.610048] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:55.415 [2024-07-15 08:06:39.610054] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.415 [2024-07-15 08:06:39.618060] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:55.415 [2024-07-15 08:06:39.618070] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:55.415 [2024-07-15 08:06:39.618075] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.415 [2024-07-15 08:06:39.626082] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:55.415 [2024-07-15 08:06:39.626092] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:55.415 [2024-07-15 08:06:39.626097] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:55.415 08:06:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:55.415 08:06:39 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:30:55.415 08:06:39 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:55.415 I/O targets: 00:30:55.415 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:55.415 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:30:55.415 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:30:55.415 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:55.415 
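The bounds test runs the bdevio CUnit suites against the four I/O targets listed above; bdevio is started in wait mode and the suites are kicked off over RPC. A minimal sketch with paths shortened (flag meanings described from memory of the tool, not from the trace):

test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json &   # -w: hold the tests until triggered over RPC
test/bdev/bdevio/tests.py perform_tests                         # runs the suites for crypto_ram..crypto_ram3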
00:30:55.415 00:30:55.415 CUnit - A unit testing framework for C - Version 2.1-3 00:30:55.415 http://cunit.sourceforge.net/ 00:30:55.415 00:30:55.415 00:30:55.415 Suite: bdevio tests on: crypto_ram3 00:30:55.415 Test: blockdev write read block ...passed 00:30:55.415 Test: blockdev write zeroes read block ...passed 00:30:55.415 Test: blockdev write zeroes read no split ...passed 00:30:55.415 Test: blockdev write zeroes read split ...passed 00:30:55.415 Test: blockdev write zeroes read split partial ...passed 00:30:55.415 Test: blockdev reset ...passed 00:30:55.415 Test: blockdev write read 8 blocks ...passed 00:30:55.415 Test: blockdev write read size > 128k ...passed 00:30:55.415 Test: blockdev write read invalid size ...passed 00:30:55.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:55.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:55.415 Test: blockdev write read max offset ...passed 00:30:55.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:55.415 Test: blockdev writev readv 8 blocks ...passed 00:30:55.415 Test: blockdev writev readv 30 x 1block ...passed 00:30:55.415 Test: blockdev writev readv block ...passed 00:30:55.415 Test: blockdev writev readv size > 128k ...passed 00:30:55.415 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:55.415 Test: blockdev comparev and writev ...passed 00:30:55.415 Test: blockdev nvme passthru rw ...passed 00:30:55.415 Test: blockdev nvme passthru vendor specific ...passed 00:30:55.415 Test: blockdev nvme admin passthru ...passed 00:30:55.415 Test: blockdev copy ...passed 00:30:55.415 Suite: bdevio tests on: crypto_ram2 00:30:55.415 Test: blockdev write read block ...passed 00:30:55.415 Test: blockdev write zeroes read block ...passed 00:30:55.415 Test: blockdev write zeroes read no split ...passed 00:30:55.415 Test: blockdev write zeroes read split ...passed 00:30:55.415 Test: blockdev write zeroes read split partial ...passed 00:30:55.415 Test: blockdev reset ...passed 00:30:55.415 Test: blockdev write read 8 blocks ...passed 00:30:55.415 Test: blockdev write read size > 128k ...passed 00:30:55.415 Test: blockdev write read invalid size ...passed 00:30:55.415 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:55.415 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:55.415 Test: blockdev write read max offset ...passed 00:30:55.415 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:55.415 Test: blockdev writev readv 8 blocks ...passed 00:30:55.415 Test: blockdev writev readv 30 x 1block ...passed 00:30:55.415 Test: blockdev writev readv block ...passed 00:30:55.416 Test: blockdev writev readv size > 128k ...passed 00:30:55.416 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:55.416 Test: blockdev comparev and writev ...passed 00:30:55.416 Test: blockdev nvme passthru rw ...passed 00:30:55.416 Test: blockdev nvme passthru vendor specific ...passed 00:30:55.416 Test: blockdev nvme admin passthru ...passed 00:30:55.416 Test: blockdev copy ...passed 00:30:55.416 Suite: bdevio tests on: crypto_ram1 00:30:55.416 Test: blockdev write read block ...passed 00:30:55.416 Test: blockdev write zeroes read block ...passed 00:30:55.416 Test: blockdev write zeroes read no split ...passed 00:30:55.416 Test: blockdev write zeroes read split ...passed 00:30:55.677 Test: blockdev write zeroes read split partial ...passed 00:30:55.677 Test: blockdev reset 
...passed 00:30:55.677 Test: blockdev write read 8 blocks ...passed 00:30:55.677 Test: blockdev write read size > 128k ...passed 00:30:55.677 Test: blockdev write read invalid size ...passed 00:30:55.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:55.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:55.677 Test: blockdev write read max offset ...passed 00:30:55.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:55.677 Test: blockdev writev readv 8 blocks ...passed 00:30:55.677 Test: blockdev writev readv 30 x 1block ...passed 00:30:55.677 Test: blockdev writev readv block ...passed 00:30:55.677 Test: blockdev writev readv size > 128k ...passed 00:30:55.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:55.677 Test: blockdev comparev and writev ...passed 00:30:55.677 Test: blockdev nvme passthru rw ...passed 00:30:55.677 Test: blockdev nvme passthru vendor specific ...passed 00:30:55.677 Test: blockdev nvme admin passthru ...passed 00:30:55.677 Test: blockdev copy ...passed 00:30:55.677 Suite: bdevio tests on: crypto_ram 00:30:55.677 Test: blockdev write read block ...passed 00:30:55.677 Test: blockdev write zeroes read block ...passed 00:30:55.677 Test: blockdev write zeroes read no split ...passed 00:30:55.938 Test: blockdev write zeroes read split ...passed 00:30:56.198 Test: blockdev write zeroes read split partial ...passed 00:30:56.198 Test: blockdev reset ...passed 00:30:56.198 Test: blockdev write read 8 blocks ...passed 00:30:56.198 Test: blockdev write read size > 128k ...passed 00:30:56.198 Test: blockdev write read invalid size ...passed 00:30:56.198 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:56.198 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:56.198 Test: blockdev write read max offset ...passed 00:30:56.198 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:56.198 Test: blockdev writev readv 8 blocks ...passed 00:30:56.198 Test: blockdev writev readv 30 x 1block ...passed 00:30:56.198 Test: blockdev writev readv block ...passed 00:30:56.198 Test: blockdev writev readv size > 128k ...passed 00:30:56.198 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:56.198 Test: blockdev comparev and writev ...passed 00:30:56.198 Test: blockdev nvme passthru rw ...passed 00:30:56.198 Test: blockdev nvme passthru vendor specific ...passed 00:30:56.198 Test: blockdev nvme admin passthru ...passed 00:30:56.198 Test: blockdev copy ...passed 00:30:56.198 00:30:56.198 Run Summary: Type Total Ran Passed Failed Inactive 00:30:56.198 suites 4 4 n/a 0 0 00:30:56.198 tests 92 92 92 0 0 00:30:56.198 asserts 520 520 520 0 n/a 00:30:56.198 00:30:56.198 Elapsed time = 1.847 seconds 00:30:56.198 0 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1813036 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1813036 ']' 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1813036 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1813036 00:30:56.198 08:06:40 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1813036' 00:30:56.198 killing process with pid 1813036 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1813036 00:30:56.198 08:06:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1813036 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:30:56.495 00:30:56.495 real 0m4.062s 00:30:56.495 user 0m10.825s 00:30:56.495 sys 0m0.480s 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:56.495 ************************************ 00:30:56.495 END TEST bdev_bounds 00:30:56.495 ************************************ 00:30:56.495 08:06:41 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:30:56.495 08:06:41 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:56.495 08:06:41 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:30:56.495 08:06:41 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:56.495 08:06:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:30:56.495 ************************************ 00:30:56.495 START TEST bdev_nbd 00:30:56.495 ************************************ 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1813677 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1813677 /var/tmp/spdk-nbd.sock 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1813677 ']' 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:56.495 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:56.495 08:06:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:56.495 [2024-07-15 08:06:41.186714] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
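The nbd test exports each crypto vbdev as a kernel /dev/nbdX device through a small bdev_svc app and verifies it with a direct-I/O dd read. A condensed sketch of one iteration of that loop, paths shortened; the explicit /dev/nbd0 argument to nbd_start_disk is optional and shown only for clarity:

# bdev_svc: minimal SPDK app that loads the bdev JSON config and serves RPC on its own socket
test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json test/bdev/bdev.json &

scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct    # 4 KiB direct read through the crypto stack
scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks          # lists nbd_device/bdev_name pairs, as shown below
scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0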
00:30:56.495 [2024-07-15 08:06:41.186758] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:56.755 [2024-07-15 08:06:41.273060] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:56.755 [2024-07-15 08:06:41.339293] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:56.755 [2024-07-15 08:06:41.360300] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:30:56.755 [2024-07-15 08:06:41.368322] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:56.755 [2024-07-15 08:06:41.376340] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:56.755 [2024-07-15 08:06:41.460532] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:30:59.293 [2024-07-15 08:06:43.609945] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:30:59.293 [2024-07-15 08:06:43.609997] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:59.294 [2024-07-15 08:06:43.610006] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.294 [2024-07-15 08:06:43.617962] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:30:59.294 [2024-07-15 08:06:43.617973] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:59.294 [2024-07-15 08:06:43.617979] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.294 [2024-07-15 08:06:43.625982] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:30:59.294 [2024-07-15 08:06:43.625991] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:59.294 [2024-07-15 08:06:43.625997] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.294 [2024-07-15 08:06:43.634001] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:30:59.294 [2024-07-15 08:06:43.634015] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:59.294 [2024-07-15 08:06:43.634020] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:59.294 1+0 records in 00:30:59.294 1+0 records out 00:30:59.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307975 s, 13.3 MB/s 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.294 08:06:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:59.553 1+0 records in 00:30:59.553 1+0 records out 00:30:59.553 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299316 s, 13.7 MB/s 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.553 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:59.812 08:06:44 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:59.812 1+0 records in 00:30:59.812 1+0 records out 00:30:59.812 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272449 s, 15.0 MB/s 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:59.812 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:00.072 1+0 records in 00:31:00.072 1+0 records out 00:31:00.072 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270166 s, 15.2 MB/s 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:00.072 08:06:44 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:00.072 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd0", 00:31:00.331 "bdev_name": "crypto_ram" 00:31:00.331 }, 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd1", 00:31:00.331 "bdev_name": "crypto_ram1" 00:31:00.331 }, 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd2", 00:31:00.331 "bdev_name": "crypto_ram2" 00:31:00.331 }, 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd3", 00:31:00.331 "bdev_name": "crypto_ram3" 00:31:00.331 } 00:31:00.331 ]' 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd0", 00:31:00.331 "bdev_name": "crypto_ram" 00:31:00.331 }, 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd1", 00:31:00.331 "bdev_name": "crypto_ram1" 00:31:00.331 }, 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd2", 00:31:00.331 "bdev_name": "crypto_ram2" 00:31:00.331 }, 00:31:00.331 { 00:31:00.331 "nbd_device": "/dev/nbd3", 00:31:00.331 "bdev_name": "crypto_ram3" 00:31:00.331 } 00:31:00.331 ]' 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:00.331 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:00.332 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:00.332 08:06:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:00.332 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:00.590 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:00.591 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:01.161 08:06:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:01.421 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:31:01.681 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:01.682 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:01.941 /dev/nbd0 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:01.941 08:06:46 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:01.941 1+0 records in 00:31:01.941 1+0 records out 00:31:01.941 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274295 s, 14.9 MB/s 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:01.941 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:31:01.941 /dev/nbd1 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:02.202 1+0 records in 00:31:02.202 1+0 records out 00:31:02.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285157 s, 14.4 MB/s 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:31:02.202 /dev/nbd10 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:02.202 1+0 records in 00:31:02.202 1+0 records out 00:31:02.202 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026907 s, 15.2 MB/s 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:02.202 08:06:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:31:02.461 /dev/nbd11 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:02.461 1+0 records in 00:31:02.461 1+0 records out 00:31:02.461 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301685 s, 13.6 MB/s 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:02.461 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:02.462 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:02.721 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd0", 00:31:02.721 "bdev_name": "crypto_ram" 00:31:02.721 }, 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd1", 00:31:02.721 "bdev_name": "crypto_ram1" 00:31:02.721 }, 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd10", 00:31:02.721 "bdev_name": "crypto_ram2" 00:31:02.721 }, 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd11", 00:31:02.721 "bdev_name": "crypto_ram3" 00:31:02.721 } 00:31:02.721 ]' 00:31:02.721 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd0", 00:31:02.721 "bdev_name": "crypto_ram" 00:31:02.721 }, 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd1", 00:31:02.721 "bdev_name": "crypto_ram1" 00:31:02.721 }, 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd10", 00:31:02.721 "bdev_name": "crypto_ram2" 00:31:02.721 }, 00:31:02.721 { 00:31:02.721 "nbd_device": "/dev/nbd11", 00:31:02.721 "bdev_name": "crypto_ram3" 00:31:02.722 } 00:31:02.722 ]' 00:31:02.722 
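Stripped of the xtrace prefixes, the export-and-inspect sequence captured above amounts to a handful of SPDK RPC calls against the dedicated nbd socket. A condensed sketch (socket path taken from the trace; the rpc.py path is shortened here for readability):

    SOCK=/var/tmp/spdk-nbd.sock
    RPC=scripts/rpc.py    # shortened; the trace uses the full workspace path

    # Export each crypto bdev through the kernel nbd driver
    $RPC -s "$SOCK" nbd_start_disk crypto_ram  /dev/nbd0
    $RPC -s "$SOCK" nbd_start_disk crypto_ram1 /dev/nbd1
    $RPC -s "$SOCK" nbd_start_disk crypto_ram2 /dev/nbd10
    $RPC -s "$SOCK" nbd_start_disk crypto_ram3 /dev/nbd11

    # List the active exports and count them, as nbd_get_count does in the trace
    $RPC -s "$SOCK" nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd

The teardown mirrors this: one nbd_stop_disk per device followed by another nbd_get_disks, whose empty '[]' response is what drives the count=0 check seen later in the log.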
08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:02.722 /dev/nbd1 00:31:02.722 /dev/nbd10 00:31:02.722 /dev/nbd11' 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:02.722 /dev/nbd1 00:31:02.722 /dev/nbd10 00:31:02.722 /dev/nbd11' 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:02.722 256+0 records in 00:31:02.722 256+0 records out 00:31:02.722 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011788 s, 89.0 MB/s 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:02.722 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:02.982 256+0 records in 00:31:02.982 256+0 records out 00:31:02.982 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0656211 s, 16.0 MB/s 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:02.982 256+0 records in 00:31:02.982 256+0 records out 00:31:02.982 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0470223 s, 22.3 MB/s 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:31:02.982 256+0 records in 00:31:02.982 256+0 records out 00:31:02.982 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0392018 s, 26.7 MB/s 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:31:02.982 256+0 records in 00:31:02.982 256+0 records out 00:31:02.982 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0388982 s, 27.0 MB/s 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:02.982 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:03.241 08:06:47 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:03.241 08:06:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:03.501 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:03.761 08:06:48 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:03.761 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:04.021 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:04.281 malloc_lvol_verify 00:31:04.281 08:06:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:04.540 e0d337b1-445a-49bd-99b0-5c98bd6d78fb 00:31:04.540 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:04.799 db49b7fd-eda3-4af4-bbc4-e19015f255dc 00:31:04.799 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:04.799 /dev/nbd0 00:31:04.799 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:31:04.799 mke2fs 1.46.5 (30-Dec-2021) 00:31:04.799 Discarding device blocks: 0/4096 done 00:31:04.799 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:04.799 00:31:04.799 Allocating group tables: 0/1 done 00:31:04.799 Writing inode tables: 0/1 done 00:31:05.059 Creating journal (1024 blocks): done 00:31:05.059 Writing superblocks and filesystem accounting information: 0/1 done 00:31:05.059 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1813677 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1813677 ']' 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1813677 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:05.059 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1813677 00:31:05.319 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:05.319 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:05.319 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1813677' 00:31:05.319 killing process with pid 1813677 00:31:05.319 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1813677 00:31:05.319 08:06:49 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 
1813677 00:31:05.319 08:06:50 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:05.319 00:31:05.319 real 0m8.943s 00:31:05.319 user 0m12.484s 00:31:05.319 sys 0m2.507s 00:31:05.319 08:06:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:05.319 08:06:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:05.319 ************************************ 00:31:05.319 END TEST bdev_nbd 00:31:05.319 ************************************ 00:31:05.579 08:06:50 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:05.579 08:06:50 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:05.579 08:06:50 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:31:05.579 08:06:50 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:31:05.579 08:06:50 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:05.579 08:06:50 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:05.579 08:06:50 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:05.580 08:06:50 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:05.580 ************************************ 00:31:05.580 START TEST bdev_fio 00:31:05.580 ************************************ 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:05.580 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:05.580 ************************************ 00:31:05.580 START TEST bdev_fio_rw_verify 00:31:05.580 ************************************ 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:05.580 08:06:50 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:06.148 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:06.148 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:06.148 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:06.148 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:06.148 fio-3.35 00:31:06.148 Starting 4 threads 00:31:21.037 00:31:21.037 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1815863: Mon Jul 15 08:07:03 2024 00:31:21.037 read: IOPS=36.1k, BW=141MiB/s (148MB/s)(1409MiB/10001msec) 00:31:21.037 slat (usec): min=11, max=530, avg=36.09, stdev=24.95 00:31:21.037 clat (usec): min=13, max=1304, avg=216.54, stdev=155.96 00:31:21.037 lat (usec): min=26, max=1421, avg=252.62, stdev=170.18 00:31:21.037 clat percentiles (usec): 00:31:21.037 | 50.000th=[ 165], 99.000th=[ 750], 99.900th=[ 963], 99.990th=[ 1156], 00:31:21.037 | 99.999th=[ 1270] 00:31:21.037 write: IOPS=39.5k, BW=154MiB/s (162MB/s)(1507MiB/9767msec); 0 zone resets 00:31:21.037 slat (usec): min=11, max=745, avg=45.83, stdev=25.09 00:31:21.037 clat (usec): min=15, max=1545, avg=248.18, stdev=160.31 00:31:21.037 lat (usec): min=37, max=1704, avg=294.02, stdev=174.85 00:31:21.037 clat percentiles (usec): 00:31:21.037 | 50.000th=[ 206], 99.000th=[ 775], 99.900th=[ 988], 99.990th=[ 1188], 00:31:21.037 | 99.999th=[ 1352] 00:31:21.037 bw ( KiB/s): min=126640, max=185440, per=98.24%, avg=155214.74, stdev=6076.27, samples=76 00:31:21.037 iops : min=31660, max=46360, avg=38803.63, stdev=1519.07, samples=76 00:31:21.037 lat (usec) : 20=0.01%, 50=0.88%, 100=16.13%, 250=49.41%, 500=25.94% 00:31:21.037 lat (usec) : 750=6.48%, 1000=1.08% 00:31:21.037 lat (msec) : 2=0.08% 00:31:21.037 cpu : usr=99.72%, sys=0.00%, ctx=66, majf=0, minf=268 00:31:21.037 IO depths : 1=0.1%, 2=28.6%, 4=57.1%, 8=14.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:21.037 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:21.037 complete : 0=0.0%, 4=87.5%, 8=12.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:21.037 issued rwts: total=360737,385774,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:21.037 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:21.037 00:31:21.037 Run status group 0 (all jobs): 00:31:21.037 READ: bw=141MiB/s (148MB/s), 141MiB/s-141MiB/s (148MB/s-148MB/s), io=1409MiB (1478MB), run=10001-10001msec 00:31:21.037 WRITE: bw=154MiB/s (162MB/s), 154MiB/s-154MiB/s (162MB/s-162MB/s), io=1507MiB (1580MB), run=9767-9767msec 00:31:21.037 00:31:21.037 real 0m13.408s 00:31:21.037 user 0m48.755s 00:31:21.037 sys 0m0.434s 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:21.037 ************************************ 00:31:21.037 END TEST bdev_fio_rw_verify 00:31:21.037 ************************************ 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:21.037 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4221b852-00ac-55c6-bdf3-ddd70841495a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4221b852-00ac-55c6-bdf3-ddd70841495a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "d8b9dd12-a0f8-5ddc-b772-617ad2d57c58"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d8b9dd12-a0f8-5ddc-b772-617ad2d57c58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "97427d56-fcb2-5124-ac82-246d955dab31"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "97427d56-fcb2-5124-ac82-246d955dab31",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "23302043-cdc2-5a25-9f6d-d52d6e7e4273"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "23302043-cdc2-5a25-9f6d-d52d6e7e4273",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:21.038 crypto_ram1 00:31:21.038 crypto_ram2 00:31:21.038 crypto_ram3 ]] 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "4221b852-00ac-55c6-bdf3-ddd70841495a"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "4221b852-00ac-55c6-bdf3-ddd70841495a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d8b9dd12-a0f8-5ddc-b772-617ad2d57c58"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d8b9dd12-a0f8-5ddc-b772-617ad2d57c58",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "97427d56-fcb2-5124-ac82-246d955dab31"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "97427d56-fcb2-5124-ac82-246d955dab31",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "23302043-cdc2-5a25-9f6d-d52d6e7e4273"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "23302043-cdc2-5a25-9f6d-d52d6e7e4273",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:21.038 
08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:21.038 ************************************ 00:31:21.038 START TEST bdev_fio_trim 00:31:21.038 ************************************ 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:21.038 08:07:03 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.038 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:21.038 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:21.038 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:21.038 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:21.038 fio-3.35 00:31:21.038 Starting 4 threads 00:31:33.287 00:31:33.287 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1818220: Mon Jul 15 08:07:16 2024 00:31:33.287 write: IOPS=59.2k, BW=231MiB/s (243MB/s)(2314MiB/10001msec); 0 zone resets 00:31:33.287 slat (usec): min=14, max=804, avg=41.66, stdev=27.39 00:31:33.287 clat (usec): min=17, max=1093, avg=143.01, stdev=84.24 00:31:33.287 lat (usec): min=33, max=1189, avg=184.67, stdev=98.23 00:31:33.287 clat percentiles (usec): 00:31:33.287 | 50.000th=[ 126], 99.000th=[ 424], 99.900th=[ 537], 99.990th=[ 619], 00:31:33.287 | 99.999th=[ 742] 00:31:33.287 bw ( KiB/s): min=215840, max=243168, per=99.98%, avg=236911.11, stdev=1770.68, samples=76 00:31:33.287 iops : min=53960, max=60792, avg=59227.74, stdev=442.68, samples=76 00:31:33.287 trim: IOPS=59.2k, BW=231MiB/s (243MB/s)(2314MiB/10001msec); 0 zone resets 00:31:33.287 slat (usec): min=4, max=173, avg= 8.64, stdev= 4.02 00:31:33.287 clat (usec): min=34, max=1189, avg=184.83, stdev=98.23 00:31:33.287 lat (usec): min=40, max=1195, avg=193.47, stdev=98.94 00:31:33.287 clat percentiles (usec): 00:31:33.287 | 50.000th=[ 161], 99.000th=[ 506], 99.900th=[ 635], 99.990th=[ 725], 00:31:33.287 | 99.999th=[ 873] 00:31:33.287 bw ( KiB/s): min=215840, max=243168, per=99.98%, avg=236911.11, stdev=1770.68, samples=76 00:31:33.287 iops : min=53960, max=60794, avg=59227.74, stdev=442.68, samples=76 00:31:33.287 lat (usec) : 20=0.01%, 50=3.80%, 100=22.14%, 250=58.61%, 500=14.78% 00:31:33.287 lat (usec) : 750=0.66%, 1000=0.01% 
00:31:33.287 lat (msec) : 2=0.01% 00:31:33.287 cpu : usr=99.71%, sys=0.00%, ctx=75, majf=0, minf=104 00:31:33.287 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:33.287 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:33.287 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:33.287 issued rwts: total=0,592427,592427,0 short=0,0,0,0 dropped=0,0,0,0 00:31:33.287 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:33.287 00:31:33.287 Run status group 0 (all jobs): 00:31:33.287 WRITE: bw=231MiB/s (243MB/s), 231MiB/s-231MiB/s (243MB/s-243MB/s), io=2314MiB (2427MB), run=10001-10001msec 00:31:33.287 TRIM: bw=231MiB/s (243MB/s), 231MiB/s-231MiB/s (243MB/s-243MB/s), io=2314MiB (2427MB), run=10001-10001msec 00:31:33.287 00:31:33.287 real 0m13.438s 00:31:33.287 user 0m49.483s 00:31:33.287 sys 0m0.421s 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:31:33.287 ************************************ 00:31:33.287 END TEST bdev_fio_trim 00:31:33.287 ************************************ 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:31:33.287 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:31:33.287 00:31:33.287 real 0m27.183s 00:31:33.287 user 1m38.423s 00:31:33.287 sys 0m1.024s 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:33.287 ************************************ 00:31:33.287 END TEST bdev_fio 00:31:33.287 ************************************ 00:31:33.287 08:07:17 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:33.287 08:07:17 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:33.287 08:07:17 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:33.287 08:07:17 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:33.287 08:07:17 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:33.287 08:07:17 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:33.287 ************************************ 00:31:33.287 START TEST bdev_verify 00:31:33.287 ************************************ 00:31:33.287 08:07:17 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:33.287 [2024-07-15 08:07:17.450796] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 
24.03.0 initialization... 00:31:33.287 [2024-07-15 08:07:17.450838] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1820009 ] 00:31:33.287 [2024-07-15 08:07:17.536427] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:33.287 [2024-07-15 08:07:17.602532] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:33.287 [2024-07-15 08:07:17.602536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.288 [2024-07-15 08:07:17.623609] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:33.288 [2024-07-15 08:07:17.631639] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:33.288 [2024-07-15 08:07:17.639661] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:33.288 [2024-07-15 08:07:17.728104] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:35.198 [2024-07-15 08:07:19.879633] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:35.198 [2024-07-15 08:07:19.879695] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:35.198 [2024-07-15 08:07:19.879704] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:35.198 [2024-07-15 08:07:19.887648] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:35.198 [2024-07-15 08:07:19.887658] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:35.198 [2024-07-15 08:07:19.887664] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:35.198 [2024-07-15 08:07:19.895668] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:35.198 [2024-07-15 08:07:19.895678] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:35.198 [2024-07-15 08:07:19.895683] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:35.198 [2024-07-15 08:07:19.903689] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:35.198 [2024-07-15 08:07:19.903700] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:35.198 [2024-07-15 08:07:19.903705] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:35.457 Running I/O for 5 seconds... 
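The notices above are emitted while bdevperf loads the bdev configuration passed via --json: each bdev_crypto_create entry names its DEK and its Malloc base bdev, and because the crypto entries are processed before the corresponding Malloc bdevs exist, creation is deferred until the base bdev arrives. The actual contents of bdev.json are not reproduced in this log; the following is only a minimal sketch of the kind of configuration these notices imply, with placeholder key material, sizes mirroring the crypto_ram geometry dumped earlier, and method/parameter names that may differ slightly between SPDK versions.

    {
      "subsystems": [
        {
          "subsystem": "accel",
          "config": [
            { "method": "accel_crypto_key_create",
              "params": { "name": "test_dek_qat_cbc", "cipher": "AES_CBC",
                          "key": "00112233445566778899aabbccddeeff" } }
          ]
        },
        {
          "subsystem": "bdev",
          "config": [
            { "method": "bdev_crypto_create",
              "params": { "name": "crypto_ram", "base_bdev_name": "Malloc0",
                          "key_name": "test_dek_qat_cbc" } },
            { "method": "bdev_malloc_create",
              "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 } }
          ]
        }
      ]
    }

Listing bdev_crypto_create ahead of bdev_malloc_create reproduces the "vbdev creation deferred pending base bdev arrival" pattern seen above: the crypto vbdev finds its key immediately but only completes registration once Malloc0 appears.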
00:31:40.742 00:31:40.742 Latency(us) 00:31:40.742 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:40.742 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x0 length 0x1000 00:31:40.742 crypto_ram : 5.05 608.21 2.38 0.00 0.00 210033.08 11040.30 126635.72 00:31:40.742 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x1000 length 0x1000 00:31:40.742 crypto_ram : 5.06 506.02 1.98 0.00 0.00 251883.93 4209.43 146800.64 00:31:40.742 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x0 length 0x1000 00:31:40.742 crypto_ram1 : 5.05 608.02 2.38 0.00 0.00 209587.81 11998.13 118569.75 00:31:40.742 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x1000 length 0x1000 00:31:40.742 crypto_ram1 : 5.06 505.90 1.98 0.00 0.00 251055.38 4562.31 140347.86 00:31:40.742 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x0 length 0x1000 00:31:40.742 crypto_ram2 : 5.04 4776.49 18.66 0.00 0.00 26581.06 6049.48 22786.36 00:31:40.742 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x1000 length 0x1000 00:31:40.742 crypto_ram2 : 5.05 3928.68 15.35 0.00 0.00 32217.26 3012.14 26012.75 00:31:40.742 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x0 length 0x1000 00:31:40.742 crypto_ram3 : 5.05 4792.84 18.72 0.00 0.00 26445.61 1398.94 23189.66 00:31:40.742 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:40.742 Verification LBA range: start 0x1000 length 0x1000 00:31:40.742 crypto_ram3 : 5.02 3903.95 15.25 0.00 0.00 32675.17 6452.78 50412.31 00:31:40.742 =================================================================================================================== 00:31:40.742 Total : 19630.12 76.68 0.00 0.00 51899.25 1398.94 146800.64 00:31:40.742 00:31:40.742 real 0m7.903s 00:31:40.742 user 0m15.218s 00:31:40.742 sys 0m0.254s 00:31:40.742 08:07:25 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:40.742 08:07:25 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:40.742 ************************************ 00:31:40.742 END TEST bdev_verify 00:31:40.742 ************************************ 00:31:40.742 08:07:25 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:40.742 08:07:25 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:40.742 08:07:25 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:40.742 08:07:25 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:40.742 08:07:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:40.742 ************************************ 00:31:40.742 START TEST bdev_verify_big_io 00:31:40.742 ************************************ 00:31:40.742 08:07:25 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:40.742 [2024-07-15 08:07:25.431140] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:31:40.742 [2024-07-15 08:07:25.431190] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1821252 ] 00:31:41.002 [2024-07-15 08:07:25.522155] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:41.002 [2024-07-15 08:07:25.598581] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:41.002 [2024-07-15 08:07:25.598586] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:41.002 [2024-07-15 08:07:25.619684] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:41.002 [2024-07-15 08:07:25.627713] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:41.002 [2024-07-15 08:07:25.635737] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:41.002 [2024-07-15 08:07:25.726278] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:43.541 [2024-07-15 08:07:27.885662] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:43.541 [2024-07-15 08:07:27.885724] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:43.541 [2024-07-15 08:07:27.885733] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.541 [2024-07-15 08:07:27.893676] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:43.541 [2024-07-15 08:07:27.893687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:43.541 [2024-07-15 08:07:27.893693] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.541 [2024-07-15 08:07:27.901697] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:43.541 [2024-07-15 08:07:27.901708] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:43.541 [2024-07-15 08:07:27.901717] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.541 [2024-07-15 08:07:27.909724] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:43.541 [2024-07-15 08:07:27.909734] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:43.541 [2024-07-15 08:07:27.909739] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:43.541 Running I/O for 5 seconds... 00:31:44.112 [2024-07-15 08:07:28.735770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.736197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
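The "Failed to get src_mbufs!" notices that begin here are emitted by the dpdk_cryptodev accel module when it cannot allocate DPDK mbufs for a crypto operation at that instant. With 128 outstanding 64 KiB requests per core across four crypto bdevs (the -q 128 -o 65536 arguments above), transient exhaustion of the module's mbuf pool is not unusual; the affected task is normally queued and retried on a later poll rather than failed, so these messages do not by themselves abort the run. One illustrative way to check that the pressure comes from queue depth alone is to repeat the same bdevperf invocation with a smaller -q; this re-run is a hypothetical follow-up, not part of this CI job, and every flag below simply mirrors the command above.

    # Hypothetical re-run with a reduced queue depth; all other flags copied from the command above.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
        -q 32 -o 65536 -w verify -t 5 -C -m 0x3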
00:31:44.112 [2024-07-15 08:07:28.736252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.736293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.736331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.736368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.736750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.736763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.739999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.740717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.741104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.741114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.112 [2024-07-15 08:07:28.744835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.112 [2024-07-15 08:07:28.745199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.745209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.748748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.749157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.749167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.752751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.753093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.753103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.113 [2024-07-15 08:07:28.756251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.756828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.757267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.757277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.760893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.761293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.761303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.113 [2024-07-15 08:07:28.764765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.764800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.765193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.765203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.768883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.769267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.769276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.772483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.772536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.772572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.772608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.773096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.773135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.773172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.773209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.773594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.773604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.113 [2024-07-15 08:07:28.776578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.776617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.776654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.776698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.777145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.777185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.777220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.777258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.777714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.777724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.780606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.780645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.780680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.780719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.781144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.781194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.781230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.781265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.781725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.781735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.784480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.784519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.784555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.784596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.113 [2024-07-15 08:07:28.784963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.785002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.785037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.785072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.785376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.785386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.787538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.113 [2024-07-15 08:07:28.787577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.787616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.787651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.788130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.788167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.788203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.788238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.788631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.788641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.114 [2024-07-15 08:07:28.791761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.114 [2024-07-15 08:07:28.792064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously between 08:07:28.792 and 08:07:29.175 ...]
00:31:44.647 [2024-07-15 08:07:29.175104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:44.647 [2024-07-15 08:07:29.175114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.176734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.176773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.176808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.176843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.177541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.179912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.179951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.179987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.180757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.182340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.647 [2024-07-15 08:07:29.182379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.182416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.182452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.182949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.183004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.183040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.647 [2024-07-15 08:07:29.183075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.183111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.183550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.183560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.185975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.186013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.186339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.186348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.648 [2024-07-15 08:07:29.188264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.188771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.189175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.189185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.190787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.190825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.190860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.190895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.191626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.193640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.193679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.193722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.193758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.194055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.648 [2024-07-15 08:07:29.194095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.194131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.194167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.194203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.194471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.194481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.196609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.197134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.197144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.648 [2024-07-15 08:07:29.199606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.199965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.201570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.201608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.201643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.201678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.202721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.204967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.648 [2024-07-15 08:07:29.205233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.205243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.207479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.207517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.207553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.207588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.208009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.208056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.208092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.208133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.648 [2024-07-15 08:07:29.208169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.208508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.208518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.210778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.211103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.211113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.649 [2024-07-15 08:07:29.213461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.213968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.214004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.214272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.214282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.215919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.215958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.216667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.217044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.217055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.649 [2024-07-15 08:07:29.219294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.219765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.220033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.220043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.221883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.221921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.221957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.221992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.222978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.224581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.224619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.224654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.224689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.649 [2024-07-15 08:07:29.225007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.225048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.225084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.225120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.225162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.225429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.225438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.227967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.228002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.228270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.228280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.229921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.229959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.229994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.649 [2024-07-15 08:07:29.230397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.230974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.232885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.232924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.232964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.233000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.233268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.649 [2024-07-15 08:07:29.233309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.233345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.233381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.233416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.233834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.233847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.235467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.235505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.235540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.235575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.235952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.235994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.236029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.236074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.650 [2024-07-15 08:07:29.236110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.236591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.236601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.238799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.239067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.239077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.241743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.242083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.650 [2024-07-15 08:07:29.242093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.243830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.243868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.243904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.243943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.244788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.247858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.248126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.248135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.249864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.650 [2024-07-15 08:07:29.249902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.249942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.249977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.250993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.253968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.255827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.255865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.255901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.650 [2024-07-15 08:07:29.255936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.256937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.258556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.258594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.260066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.260104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.650 [2024-07-15 08:07:29.260372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.260413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.260449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.260485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.260520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.260830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.260840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.263270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.263308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.263343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.264825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.651 [2024-07-15 08:07:29.265099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.923 [2024-07-15 08:07:29.620241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.620734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.621183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.621193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.623849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.624118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.624128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.625829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.625868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.625904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.923 [2024-07-15 08:07:29.625939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.626947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.628725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.628771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.628806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.628841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.629733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.631638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.631676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.631718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.631755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.632190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.923 [2024-07-15 08:07:29.632232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.632267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.632302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.632339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.632683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.632693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.634327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.634366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.634401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.634440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.923 [2024-07-15 08:07:29.634716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.634765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.634802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.634838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.634873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.635140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.635150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.924 [2024-07-15 08:07:29.637910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.637945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.638213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.638223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.639873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.639912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.639948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.639983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.640876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.642801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.642840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.642883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.642919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.643188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.643227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.643263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.643298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.643334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.924 [2024-07-15 08:07:29.643731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.643742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.645979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.646460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.646471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.924 [2024-07-15 08:07:29.648102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.648915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.925 [2024-07-15 08:07:29.650984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.651960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.653551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.653589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.653628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.653663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.654719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.656601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.656639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.925 [2024-07-15 08:07:29.656674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.656713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.656983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.657027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.657067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.657103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.657140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.657659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.657669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.659989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.660036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.660495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.660506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.662644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.662683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.662722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.662758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.925 [2024-07-15 08:07:29.663162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.663201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.663237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.663273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.663309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.663843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.663852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.925 [2024-07-15 08:07:29.666757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.666793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.667258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.667268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.669335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.669374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.669410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.669445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.669887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.669931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:44.926 [2024-07-15 08:07:29.669970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.670005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:44.926 [2024-07-15 08:07:29.670042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.670415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.670427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.672788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.672828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.672864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.672899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.673804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.675997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.676991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.190 [2024-07-15 08:07:29.677048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.677469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.677480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.679568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.679607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.679643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.680020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.680416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.680457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.680493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.680539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.680575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.681052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.681063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.683648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.684032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.684407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.684790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.685166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.685544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.685923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.686296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.686670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.687059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.190 [2024-07-15 08:07:29.687074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.689717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.690095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.690470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.690847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.691323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.691701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.692093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.692465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.692843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.693338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.693349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.695771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.696150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.696523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.696900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.697318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.697696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.698074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.698449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.698827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.699312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.699323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.702184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.190 [2024-07-15 08:07:29.702561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.702938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.703312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.703691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.704073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.704449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.704826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.705207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.705605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.705615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.708237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.708635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.709012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.709385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.709849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.190 [2024-07-15 08:07:29.710228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.710623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.711001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.711375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.711866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.711877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.714427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.714807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.715184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.191 [2024-07-15 08:07:29.715557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.715965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.716345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.716723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.717097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.717472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.717876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.717888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.720616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.720999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.721374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.721751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.722197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.722579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.722958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.723332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.723704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.724185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.724195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.726653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.727038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.727413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.727791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.191 [2024-07-15 08:07:29.728232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.191 [2024-07-15 08:07:29.728612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* entry from accel_dpdk_cryptodev.c:468 repeated for every allocation attempt between 2024-07-15 08:07:29.728612 and 2024-07-15 08:07:30.088316 ...]
00:31:45.461 [2024-07-15 08:07:30.088316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:45.461 [2024-07-15 08:07:30.088352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.088849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.088892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.088928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.088964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.089003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.089275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.461 [2024-07-15 08:07:30.089285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.091731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.091770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.091806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.091841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.092699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.094851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.094890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.094931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.094967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.095340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.462 [2024-07-15 08:07:30.095381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.095417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.095454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.095489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.095864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.095875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.097861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.097900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.097937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.097973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.098890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.100783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.100821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.100857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.100893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.101303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.101348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.101409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.462 [2024-07-15 08:07:30.101445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.101480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.101857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.101867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.103729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.103768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.103803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.103838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.104822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.106680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.106721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.106761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.106800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.107211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.107251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.107287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.107337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.107373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.462 [2024-07-15 08:07:30.107832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.107842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.109879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.109918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.109954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.109989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.110928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.112898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.112935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.112970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.113022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.113523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.113562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.113598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.113633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.113668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.114018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.114028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.462 [2024-07-15 08:07:30.116364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.116403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.117345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.117382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.117755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.462 [2024-07-15 08:07:30.117796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.117831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.117883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.117918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.118418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.118428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.120447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.120486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.120521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.120898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.121857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.124171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.124553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.463 [2024-07-15 08:07:30.125765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.126509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.126896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.128397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.128855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.129229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.131061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.131465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.131475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.134569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.134950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.135324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.135698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.136182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.136561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.136939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.137312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.137686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.138177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.138188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.141936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.143642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.144975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.146452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.463 [2024-07-15 08:07:30.146724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.148452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.148974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.149349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.149725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.150120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.150130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.153230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.154947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.156658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.157891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.158358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.158741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.159129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.159931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.161417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.161687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.161698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.165080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.165459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.165838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.166212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.166605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.168347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.463 [2024-07-15 08:07:30.169974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.171725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.173587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.173964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.173975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.176102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.176481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.177392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.178865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.179139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.180844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.181899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.183585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.185444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.185717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.185727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.188863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.190351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.463 [2024-07-15 08:07:30.192054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.193761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.194287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.196012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.197866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.199639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.464 [2024-07-15 08:07:30.201303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.201753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.201764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.205434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.207153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.464 [2024-07-15 08:07:30.208116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.726 [2024-07-15 08:07:30.209666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.726 [2024-07-15 08:07:30.209946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.726 [2024-07-15 08:07:30.211665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.726 [2024-07-15 08:07:30.213232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.726 [2024-07-15 08:07:30.213607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.726 [2024-07-15 08:07:30.213988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.214486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.214497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.217232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.218718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.220407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.222118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.222609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.223006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.223382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.223758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.225217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.225515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.727 [2024-07-15 08:07:30.225525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.228809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.230403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.230785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.231160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.231628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.232104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.233584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.235289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.236993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.237324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.237334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.239317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.239695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.240073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.241631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.241948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.243662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.245363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.246303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.247779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.248050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.248061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.250322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.727 [2024-07-15 08:07:30.252012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.253794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.255508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.255790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.257107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.258591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.260284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.261993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.262358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.262371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.266278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.268039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.269679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.271037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.271361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.273072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.274785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.275464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.275841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.276218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.276228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.279135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.280998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.282690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.727 [2024-07-15 08:07:30.284494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.284774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.285154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.285528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.285905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.286467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.286807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.286816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.290293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.291996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.292698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.293077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.293502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.293884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.295380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.296863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.298560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.298836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.298847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.300964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.301347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.301722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.302977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.303302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.727 [2024-07-15 08:07:30.305014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.306726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.307686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.309169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.309442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.309452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.311761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.313572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.315233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.317015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.317291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.318380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.727 [2024-07-15 08:07:30.319862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.321570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.323279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.323727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.323737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.327559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.329346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.331172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.332387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.332712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.334427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.336135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.728 [2024-07-15 08:07:30.336954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.337329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.337722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.337733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.340656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.342494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.344234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.346068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.346343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.346723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.347097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.347471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.348223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.348516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.348526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.351939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.353655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.354174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.354548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.354984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.355361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.357035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.358654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.360425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.728 [2024-07-15 08:07:30.360696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.360706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.362779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.363156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.363530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.364908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.365224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.366941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.368653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.369618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.371101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.371373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.371382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.373658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.375342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.377173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.378909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.379184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.380448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.381935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.383631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.385345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.385727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.385737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.728 [2024-07-15 08:07:30.389640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.391355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.392843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.394391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.394705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.396424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.398139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.398613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.398989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.399461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.399471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.402126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.403629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.405338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.407048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.407401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.407782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.408157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.408531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.409878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.410237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.410247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.413671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.415411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.728 [2024-07-15 08:07:30.415789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.416163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.416620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.417075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.418567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.420263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.421970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.422298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.422308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.424292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.424668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.425046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.426778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.427111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.428828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.430540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.728 [2024-07-15 08:07:30.431545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.433029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.433307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.433318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.435836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.437323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.438937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.440339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.729 [2024-07-15 08:07:30.440637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.442103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.443720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.444572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.445940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.446368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.446379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.449914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.451630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.452588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.454068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.454343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.456061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.457533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.457912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.458289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.458771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.458784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.461539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.463037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.464776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.466491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.466983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.467361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.729 [2024-07-15 08:07:30.467739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.468117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.468490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.469024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.469034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.471597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.471978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.472352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.472728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.473177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.473554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.473931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.474303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.474677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.475117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.475128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.477807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.478195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.478231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.478604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.479052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.729 [2024-07-15 08:07:30.479431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.479813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.480191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.991 [2024-07-15 08:07:30.480564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.481034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.481045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.483642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.484024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.484417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.484453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.484878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.485257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.485630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.486008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.486387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.486889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.486900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.489762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.490126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.991 [2024-07-15 08:07:30.490136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.492357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.492395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.492430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.492466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.492943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.492984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.493021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.493056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.493092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.493461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.493471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.495713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.495751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.495792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.495827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.496750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.499149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.991 [2024-07-15 08:07:30.499188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.991 [2024-07-15 08:07:30.499225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.499261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.499720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.499761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.499797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.499832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.499868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.500244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.500253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.502682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.502724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.502760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.502795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.503743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.505934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.505973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.506009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.992 [2024-07-15 08:07:30.506044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.506422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.506469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.506505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.506543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.506579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.507088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.507098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.509664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.509716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.509752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.509788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.510794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.992 [2024-07-15 08:07:30.513573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.513686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.514185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.514196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.516385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.516434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.516470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.516505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.516958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.517003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.517039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.517075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.517110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.517496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.517506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.519741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.519779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.519829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.519864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.520393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.520434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.520470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.992 [2024-07-15 08:07:30.520509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.520545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.521010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.521020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.523839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.524183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.524193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.526392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.526431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.526471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.526507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.526989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.527032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.527067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.527102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.527138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.992 [2024-07-15 08:07:30.527501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.527511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.529768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.529807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.992 [2024-07-15 08:07:30.529842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.529878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.530789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.533903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.534280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.534289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.993 [2024-07-15 08:07:30.536797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.536835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.536871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.536906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.537837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.540747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.541019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.541029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.544647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.544693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.993 [2024-07-15 08:07:30.544741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.544778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.545804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.548664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.548705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.548744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.548779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.549473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.554653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.554694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.554733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.554769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.993 [2024-07-15 08:07:30.555149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.555189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.555225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.555260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.555309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.555804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.555814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.560091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.560132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.560492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.560537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.560572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.560872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.725400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.725459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.725811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.725865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.726214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.726256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.727339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.727628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.727637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.727646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.736210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:45.993 [2024-07-15 08:07:30.737946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.739777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.740049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.740059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.742144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.742522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.742900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:45.993 [2024-07-15 08:07:30.744464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.746458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.748167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.749130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.750613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.750892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.750903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.753339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.754828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.756536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.758246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.760035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.761520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.763228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.764937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.765389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.765399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.256 [2024-07-15 08:07:30.769158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.770877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.772055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.773864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.775840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.777620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.778000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.778373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.778884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.778894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.781560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.783046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.784760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.786469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.787236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.787612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.787988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.789566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.789899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.789909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.793193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.794571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.794947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.256 [2024-07-15 08:07:30.795320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.523 [2024-07-15 08:07:31.061964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.062005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.063144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.063182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.063652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.064030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.064067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.065549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.065825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.065835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.069108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.069148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.069823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.069859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.070330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.070703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.070744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.071117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.523 [2024-07-15 08:07:31.071387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.071397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.074760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.074799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.076508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.076545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.524 [2024-07-15 08:07:31.076978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.078756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.078794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.079171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.079488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.079498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.083089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.083128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.084088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.084132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.084490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.086205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.086242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.087951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.088346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.088356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.092220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.092260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.093957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.093995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.094299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.095498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.095535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.097016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.524 [2024-07-15 08:07:31.097289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.097299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.101066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.101105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.101478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.101515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.101855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.103342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.103379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.105088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.105364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.105374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.107400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.107439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.107815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.107852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.108357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.109401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.109438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.110917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.111190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.111201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.114604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.114644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.524 [2024-07-15 08:07:31.115875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.115912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.116401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.116779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.116817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.118323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.118794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.118804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.121780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.121818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.123332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.123370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.123674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.125459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.125496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.125872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.126265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.126282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.130053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.130093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.131060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.131097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.131428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.133124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.524 [2024-07-15 08:07:31.133161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.134870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.135364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.135374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.137930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.137969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.139452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.139489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.139795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.140794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.140832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.142316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.142586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.142596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.145930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.145969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.147451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.147481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.147789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.149578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.149615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.524 [2024-07-15 08:07:31.150918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.151268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.151281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.525 [2024-07-15 08:07:31.152998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.153524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.153561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.153596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.154627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.155026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.155068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.156657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.158158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.158195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.158467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.158477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.161430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.161810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.162184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.162557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.162875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.164363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.166099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.167813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.168490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.168763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.168775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.170818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.525 [2024-07-15 08:07:31.171851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.172778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.173161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.173569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.173956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.174331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.174712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.175086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.175555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.175566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.177857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.178234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.178608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.178998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.179362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.179744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.180119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.180491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.182271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.182780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.182790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.185126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.185503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.185882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.525 [2024-07-15 08:07:31.186255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.186684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.187065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.188704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.189085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.190260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.190712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.190722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.193312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.193689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.194069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.194918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.195275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.195653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.197086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.197615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.197992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.198561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.198571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.201344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.202270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.203307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.203680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.203957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.525 [2024-07-15 08:07:31.204338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.204717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.205089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.205462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.205849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.205859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.208987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.209430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.210944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.211317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.211639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.212020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.212394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.212769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.213145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.213502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.213513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.216904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.217283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.217660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.218036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.218490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.218873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.525 [2024-07-15 08:07:31.219248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.525 [2024-07-15 08:07:31.219621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.220104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.220377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.220387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.223015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.223392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.223768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.224141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.224525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.224906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.225835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.226865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.227238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.227509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.227520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.230343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.230736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.231110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.231482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.231799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.232532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.232924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.234497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.234872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.526 [2024-07-15 08:07:31.235298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.235313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.237784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.238166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.239938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.240312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.240683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.242185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.242561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.242937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.243316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.243852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.243863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.247681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.248061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.248938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.248976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.249381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.249763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.250138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.250511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.250886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.251294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.251305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.526 [2024-07-15 08:07:31.253647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.255069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.255606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.255987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.256553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.256935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.257309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.257681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.258061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.258476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.258485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.260834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.260874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.261788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.261826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.262187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.263497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.264133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.264508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.264884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.265262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.265273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.268521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.268562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.526 [2024-07-15 08:07:31.269061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.269099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.269487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.269869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.269908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.271118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.271156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.271641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.271652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.274048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.526 [2024-07-15 08:07:31.274099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.275769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.275808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.276267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.277095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.277139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.278028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.278068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.278465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.278479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.281542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.281583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.283288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.283326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.789 [2024-07-15 08:07:31.283598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.284624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.284664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.789 [2024-07-15 08:07:31.286255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.286294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.286732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.286743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.290265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.290305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.291930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.291967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.292322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.294159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.294198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.295984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.296021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.296291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.296302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.299219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.299258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.299633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.299672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.299996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.301718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.790 [2024-07-15 08:07:31.301756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.303468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.303505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.303880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.303894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.307170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.307210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.307585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.307622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.307938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.308775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.308814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.309375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.309412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.309706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.309720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.313095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.313135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.314843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.314881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.315327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.316921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.316959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.317332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.790 [2024-07-15 08:07:31.317369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.317655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.317665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.321369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.321413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.322371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.322408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.322766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.324478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.324518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.326226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.326264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.326738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.326749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.329350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.329391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.330866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.330904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.331175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.332885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.332924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.333890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.333927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.334236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.790 [2024-07-15 08:07:31.334246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.336358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.336398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.338235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.338273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.338777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.339477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.339514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.790 [2024-07-15 08:07:31.341008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.341046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.341321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.341331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.344654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.344697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.345795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.345832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.346268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.346644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.346682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.348525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.348562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.349035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.349046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.351748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.791 [2024-07-15 08:07:31.351788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.353266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.353304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.353573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.355283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.355321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.356288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.356325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.356731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.356741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.360184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.360224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.361926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.361963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.362235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.363206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.363244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.364730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.364769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.365039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.365049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.368807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.368848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.369221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.791 [2024-07-15 08:07:31.369257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.369566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.371053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.371091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.372797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.372835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.373106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.373116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.375951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.375991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.376586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.376623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.377006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.378776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.378815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.379188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.379224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.379527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.379536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.382663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.382703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.384537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.384576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.384858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.791 [2024-07-15 08:07:31.386048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.386086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.386636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.386673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.387054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.387064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.390636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.390676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.392126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.393724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.394053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.395751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.791 [2024-07-15 08:07:31.395798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.397514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.397553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.397950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.397960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.399986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.401259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.402739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.402777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.403048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.404762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.404801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.792 [2024-07-15 08:07:31.404837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.406012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.406317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.406327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.408700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.409139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.409150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.410842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.410880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.410916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.410950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.411351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.411390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.411426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.411462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.411497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.792 [2024-07-15 08:07:31.411796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.411806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.413494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.413533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.413569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.413605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.413934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.413975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.414011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.414046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.414081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.414531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.414540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.416780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.417050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.417060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.792 [2024-07-15 08:07:31.418784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.418823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.418859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.418894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.419833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.421976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.422247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.422257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.424056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.424096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.792 [2024-07-15 08:07:31.424135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.792 [2024-07-15 08:07:31.424185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.424453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.424495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.424531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.424566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.424602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.425019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.425031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.426625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.426664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.426702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.426743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.427502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.793 [2024-07-15 08:07:31.429807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.429958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.430341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.430354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.431953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.431992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.432800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.434801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.434840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.434879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.434914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.435323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.435362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.793 [2024-07-15 08:07:31.435398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.435434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.435471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.435852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.435862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.437475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.437514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.437550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.439728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.441628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.441667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.441703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.441744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.442138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.442178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.442214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.442249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.793 [2024-07-15 08:07:31.442285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.793 [2024-07-15 08:07:31.442591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.442608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.444355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.446052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.446090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.447803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.448761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.450847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.452336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.452374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.454079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.454353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.454393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.455421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.455459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.456938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.457209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.794 [2024-07-15 08:07:31.457220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.459179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.460970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.461008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.461381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.461705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.461751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.463234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.463272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.464975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.465247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.465257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.466989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.468221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.468258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.468770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.469168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.469221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.470897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.470935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.471307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.471613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.471623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.473302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.794 [2024-07-15 08:07:31.474982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.475021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.476739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.477010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.477050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.478339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.478376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.478838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.479216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.479226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.481030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.482737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.482775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.484069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.484343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.484384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.485980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.486018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.487834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.488108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.488118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.489988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.490364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.490401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.794 [2024-07-15 08:07:31.492212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.492512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.492552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.494325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.494370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.495966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.496285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.496295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.497974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.794 [2024-07-15 08:07:31.498350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.498389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.499399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.499770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.499818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.500192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.500228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.501859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.502133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.502143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.503736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.505438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.505477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.506330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.506602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.795 [2024-07-15 08:07:31.506643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.507021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.507061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.508490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.508978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.508989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.510651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.511616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.511654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.513129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.513405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.513446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.515161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.515199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.516045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.516371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.516380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.519317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.521032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.521071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.522139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.522452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.522493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.524181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:46.795 [2024-07-15 08:07:31.524219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.525930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.526455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.526466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.528354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.528891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.528929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.530633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.530911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.530951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.531900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.531937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.533701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.533981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.533992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.538133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.539955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.539997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.541829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.795 [2024-07-15 08:07:31.542104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:46.796 [2024-07-15 08:07:31.542145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.543695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.543737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.545378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.058 [2024-07-15 08:07:31.545737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.545747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.547445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.548265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.548302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.549234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.549625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.549666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.551368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.551406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.553170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.553443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.553453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.558598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.559064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.559102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.559899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.560260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.560301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.560674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.560715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.562491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.562788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.562799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.058 [2024-07-15 08:07:31.564412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.566173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.566211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.567631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.568026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.568067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.569053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.569091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.569465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.569737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.569748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.573783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.574508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.574546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.574923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.575415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.575469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.576348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.576386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.577249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.577648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.577658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.579657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.581284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.058 [2024-07-15 08:07:31.581327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.581363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.581842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.581883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.582620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.582658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.058 [2024-07-15 08:07:31.583653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.584061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.584073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.587002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.587045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.587081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.587454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.587735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.587776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.588152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.588890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.588927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.589288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.589298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.591664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.592971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.593616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.593993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.059 [2024-07-15 08:07:31.594366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.595027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.596321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.596694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.598180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.598670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.598682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.602408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.602792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.603166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.604701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.605192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.605572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.607303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.607676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.608054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.608460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.608470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.610647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.611029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.611640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.612982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.613396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.614375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.059 [2024-07-15 08:07:31.615345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.615723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.616098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.616471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.616481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.621455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.622454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.622833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.624628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.625127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.625506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.625885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.626968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.627842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.628229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.628241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.631530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.632091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.632643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.634032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.634480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.634861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.635236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.636748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.059 [2024-07-15 08:07:31.637191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.637575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.637585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.642340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.643630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.059 [2024-07-15 08:07:31.644009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.644383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.644770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.646433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.646817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.647558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.648774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.649146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.649156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.652012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.653072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.653448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.653840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.654384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.656186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.656563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.657384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.658507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.658903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.060 [2024-07-15 08:07:31.658913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.661834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.662218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.663156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.664175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.664564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.665801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.666516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.666893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.667279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.667641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.667651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.669937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.670315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.671765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.672257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.672638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.674413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.674793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.675167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.675540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.675832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.675843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.680418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.060 [2024-07-15 08:07:31.680803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.681661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.682762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.683139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.683517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.683895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.685715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.686090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.686447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.686457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.690061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.690975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.692015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.692390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.692661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.693172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.693550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.693925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.695147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.695566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.695576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.698512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.699232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.700472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.060 [2024-07-15 08:07:31.700510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.700898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.701276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.701650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.703404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.703783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.704120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.704130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.707505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.707887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.060 [2024-07-15 08:07:31.709417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.709838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.710218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.711864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.713639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.715348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.716856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.717202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.717211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.723643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.723685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.724062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.724099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.724413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.061 [2024-07-15 08:07:31.725902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.727601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.729313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.730270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.730584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.730594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.732678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.732723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.734445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.734482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.735036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.735613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.735650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.737123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.737161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.737431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.737441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.744107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.744149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.744522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.744559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.744838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.745403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.745440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.061 [2024-07-15 08:07:31.746257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.746295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.746625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.746634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.749985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.750025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.751730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.751768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.752159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.753550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.753588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.753965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.754002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.754276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.754287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.759488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.759531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.761241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.761279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.761551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.762397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.762435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.763497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.763534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.061 [2024-07-15 08:07:31.763920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.763930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.767396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.767437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.769148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.769187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.769550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.771038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.771076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.772774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.772811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.773080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.773090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.779160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.779202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.780910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.780947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.781220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.782273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.782311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.061 [2024-07-15 08:07:31.783791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.783828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.784099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.784109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.062 [2024-07-15 08:07:31.787249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.787290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.787664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.787701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.788054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.789545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.789584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.791277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.791315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.791584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.791595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.796320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.796363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.797906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.797944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.798430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.798899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.798937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.800419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.800457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.800731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.800742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.804044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.804084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.062 [2024-07-15 08:07:31.804964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.805000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.805404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.805786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.805825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.807522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.807559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.808096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.062 [2024-07-15 08:07:31.808107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.814077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.814120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.815827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.815864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.816400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.818244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.818283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.818654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.818697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.819014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.819024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.822572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.822613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.823567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.327 [2024-07-15 08:07:31.823605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.328 [2024-07-15 08:07:31.823933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.825649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.825688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.827396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.827434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.827838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.827848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.834764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.834807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.836518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.836557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.836831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.838032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.838071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.839545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.839582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.839857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.839869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.843570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.843610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.843986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.844024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.844319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.845805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.328 [2024-07-15 08:07:31.845846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.847557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.847595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.847866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.847877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.853047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.853102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.854850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.854888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.855371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.856174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.856212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.857691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.857733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.858004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.858015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.861374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.861415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.862586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.862623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.863075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.863454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.863501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.328 [2024-07-15 08:07:31.865289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.328 [2024-07-15 08:07:31.865327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:47.611 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats several hundred times between 08:07:31.865 and 08:07:32.222; only the first and last occurrences are kept here ...]
00:31:47.612 [2024-07-15 08:07:32.222844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:47.612 [2024-07-15 08:07:32.224553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.225528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.225908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.226274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.226284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.231510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.233243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.235077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.235114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.235619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.236000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.236374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.236925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.238406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.238676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.238686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.244112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.244491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.244868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.246515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.246867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.248579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.250288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.251250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.612 [2024-07-15 08:07:32.252728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.253002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.253019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.258834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.258889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.260598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.260636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.260994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.262447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.263925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.265629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.267328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.267769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.267780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.273020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.273061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.274841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.274878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.275148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.277003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.277040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.278580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.278616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.279075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.612 [2024-07-15 08:07:32.279085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.283910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.283952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.285433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.285470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.285749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.287462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.287500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.287995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.288031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.288420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.288430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.293882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.293928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.295637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.295675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.295946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.297312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.612 [2024-07-15 08:07:32.297350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.297727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.297764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.298117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.298127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.303394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.613 [2024-07-15 08:07:32.303436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.305267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.305305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.305574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.305956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.305993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.306366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.306402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.306873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.306883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.313083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.313129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.314840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.314879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.315276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.315653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.315691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.316069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.316107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.316488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.316498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.322580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.322623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.323734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.613 [2024-07-15 08:07:32.323782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.324311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.324689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.324730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.325104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.325142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.325427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.325437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.331827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.331877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.332250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.332287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.332667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.333065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.333103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.334238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.334275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.334640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.334654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.339769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.339811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.340185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.340221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.340738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.613 [2024-07-15 08:07:32.341481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.341518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.342998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.343035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.343305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.343316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.348876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.348919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.349293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.349330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.349723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.351346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.351385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.353136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.353174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.353486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.353496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.357728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.357769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.359253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.359291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.359564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.361287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.361326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.613 [2024-07-15 08:07:32.362260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.362297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.362615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.613 [2024-07-15 08:07:32.362625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.875 [2024-07-15 08:07:32.368406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.875 [2024-07-15 08:07:32.368449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.370146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.370183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.370456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.371888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.371926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.373681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.373721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.373991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.374001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.379686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.379731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.381439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.381477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.381751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.382665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.382702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.384182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.384220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.876 [2024-07-15 08:07:32.384489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.384500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.390378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.390420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.392131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.392168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.392545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.394013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.394051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.395603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.395641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.395913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.395923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.400390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.400434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.400812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.400849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.401280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.401657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.401695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.402071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.402114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.402634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.402644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.876 [2024-07-15 08:07:32.405995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.406037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.406410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.406447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.406816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.407195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.407233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.407606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.407643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.408024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.408034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.411367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.411408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.411788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.411825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.412319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.412696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.412758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.413131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.413168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.413626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.413636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.417188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.417231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.876 [2024-07-15 08:07:32.417604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.417980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.418476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.418858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.418901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.419274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.419310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.419722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.419732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.422681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.423075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.423449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.423486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.423866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.424247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.424285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.424347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.424723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.425191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.425201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.428126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.428168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.876 [2024-07-15 08:07:32.428203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.428238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.877 [2024-07-15 08:07:32.428626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.428667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.428704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.428751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.428787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.429253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.429263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.432808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.433172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.433182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.877 [2024-07-15 08:07:32.436704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.436783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.437182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.437192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.439992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.440700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.441143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.441153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.444982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.877 [2024-07-15 08:07:32.445018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.445350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.445360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.448330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.448371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.448417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.448452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.448928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.448969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.449009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.449044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.449079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.449477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.449487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.452414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.452455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.452495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.452552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.452938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.452992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.453027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.453063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.453101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:47.877 [2024-07-15 08:07:32.453554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:47.877 [2024-07-15 08:07:32.453564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same *ERROR*: Failed to get src_mbufs! entry from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats continuously, one entry per timestamp, from 08:07:32.453 through 08:07:32.846 ...]
00:31:48.146 [2024-07-15 08:07:32.846545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:48.146 [2024-07-15 08:07:32.846582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.146 [2024-07-15 08:07:32.846939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.146 [2024-07-15 08:07:32.846949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.849458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.849498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.849874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.849911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.850412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.850793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.850831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.851205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.851241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.851686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.851695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.854088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.854127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.854500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.854549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.855042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.855544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.855582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.857074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.857111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.857381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.147 [2024-07-15 08:07:32.857392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.860720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.860759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.861133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.861170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.861561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.861942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.861984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.862485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.862521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.862831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.862841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.866310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.866350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.868060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.868097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.868465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.868845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.868883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.869256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.869292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.869690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.869701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.872933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.147 [2024-07-15 08:07:32.872974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.874673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.874713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.874985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.876301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.876339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.876726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.876764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.877123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.877133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.880751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.880791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.882243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.882281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.882639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.884357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.884395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.886181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.886219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.886719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.886729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.890382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.890421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.892132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.147 [2024-07-15 08:07:32.892170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.892525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.894376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.894414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.896249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.896286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.896556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.147 [2024-07-15 08:07:32.896567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.899788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.899828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.901310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.901347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.901620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.903336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.903374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.904537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.904575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.904887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.904898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.907151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.907191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.908108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.908145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.908447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.411 [2024-07-15 08:07:32.910162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.910200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.911907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.911946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.912321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.912330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.914333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.914373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.914751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.915375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.915746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.917493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.917531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.919236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.919273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.919605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.919615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.921224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.921601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.921980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.922018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.922402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.923947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.923986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.411 [2024-07-15 08:07:32.924021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.925812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.926084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.926098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.927683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.927726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.927762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.927797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.928714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.930595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.930634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.930669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.930704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.930978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.931018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.411 [2024-07-15 08:07:32.931055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.931091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.931127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.412 [2024-07-15 08:07:32.931451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.931461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.933853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.934248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.934260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.935896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.935934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.935970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.936797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.412 [2024-07-15 08:07:32.938751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.938790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.938828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.938863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.939696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.941803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.942178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.942188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.412 [2024-07-15 08:07:32.944291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.944747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.945015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.945025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.946695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.946738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.946774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.946812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.947827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.949563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.949601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.949636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.949674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.412 [2024-07-15 08:07:32.950152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.950192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.950227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.950262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.950298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.950661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.950671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.952611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.952649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.952684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.952734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.953701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.955359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.955397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.955436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.955472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.955743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.412 [2024-07-15 08:07:32.955783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.413 [2024-07-15 08:07:32.955826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.955862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.955897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.956166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.956176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.958386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.958424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.958459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.960872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.962552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.962590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.962625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.962660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.963137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.963177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.963213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.963248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.413 [2024-07-15 08:07:32.963284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.963666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.963679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.965702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.967411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.967448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.968421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.968771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.968811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.968847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.968882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.968917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.969187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.969197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.971173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.972009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.972047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.973529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.973804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.973844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.975556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.975593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.976545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.976848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.413 [2024-07-15 08:07:32.976858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.978801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.979178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.979214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.979808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.980177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.980217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.981925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.981962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.983676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.984017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.984027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.985633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.986012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.986048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.986421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.986795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.986836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.988586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.988624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.990417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.990690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.990699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.413 [2024-07-15 08:07:32.992279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.413 [2024-07-15 08:07:32.993390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.413 ... (same *ERROR* from accel_dpdk_cryptodev.c:468 repeated for each allocation attempt between 08:07:32.993 and 08:07:33.337) ... 
00:31:48.680 [2024-07-15 08:07:33.337016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.680 [2024-07-15 08:07:33.337026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.338661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.340195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.340572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.340608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.340977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.341355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.341392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.341428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.343064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.343431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.343441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.344984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.345507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.346031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.346041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.680 [2024-07-15 08:07:33.348066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.348837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.350463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.350514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.350549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.350584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.351703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.680 [2024-07-15 08:07:33.353458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.353966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.354250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.354259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.356879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.357242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.357252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.358965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.680 [2024-07-15 08:07:33.359388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.359823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.362883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.364572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.364611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.364647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.364682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.365121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.365164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.365200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.680 [2024-07-15 08:07:33.365236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.365271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.365638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.365647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.367591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.367631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.367669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.367705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.368574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.370429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.370468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.370504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.370546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.371078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.371117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.371155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.371191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.371226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.680 [2024-07-15 08:07:33.371550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.371561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.680 [2024-07-15 08:07:33.373751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.374019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.374032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.376294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.376333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.376369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.377843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.378811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.681 [2024-07-15 08:07:33.380636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.380675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.380715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.380751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.381729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.383373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.384867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.384905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.386613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.386889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.386930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.386967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.387003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.387042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.387503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.387514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.389440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.391188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:48.681 [2024-07-15 08:07:33.391236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.392879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.393232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.393273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.394757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.394795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.396503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.396779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.396791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.399366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.400991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.401028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.402729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.403004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.403044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.404323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.404361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:48.681 [2024-07-15 08:07:33.405260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
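The allocation failures above come from the dpdk_cryptodev accel module running out of source mbufs while bdevperf keeps 128 large verify I/Os in flight per crypto bdev; the allocations are evidently retried once earlier tasks return their mbufs, since the verify job reported below still completes with zero failures. The exact command for this stage appears earlier in the log; a sketch of the invocation pattern it follows, with queue depth, I/O size and workload read off the result table below and the runtime only approximated, is:

# Illustrative only -- reconstructed from the result table (workload: verify,
# depth: 128, IO size: 65536); the -t value is a guess from the ~5-6 s runtimes,
# and the real command line sits earlier in this log.
BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
CONF=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
"$BDEVPERF" --json "$CONF" -q 128 -o 65536 -w verify -t 5 ''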
00:31:49.619 00:31:49.619 Latency(us) 00:31:49.619 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:49.619 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x0 length 0x100 00:31:49.619 crypto_ram : 5.74 44.57 2.79 0.00 0.00 2773325.98 432335.95 2103604.78 00:31:49.619 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x100 length 0x100 00:31:49.619 crypto_ram : 6.03 42.43 2.65 0.00 0.00 2936112.05 202455.83 2581110.15 00:31:49.619 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x0 length 0x100 00:31:49.619 crypto_ram1 : 5.76 47.05 2.94 0.00 0.00 2570378.24 938.93 1910021.51 00:31:49.619 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x100 length 0x100 00:31:49.619 crypto_ram1 : 6.03 42.42 2.65 0.00 0.00 2828601.50 202455.83 2361715.79 00:31:49.619 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x0 length 0x100 00:31:49.619 crypto_ram2 : 5.57 318.45 19.90 0.00 0.00 363397.59 62511.26 500090.09 00:31:49.619 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x100 length 0x100 00:31:49.619 crypto_ram2 : 5.62 253.93 15.87 0.00 0.00 447716.81 10384.94 532353.97 00:31:49.619 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x0 length 0x100 00:31:49.619 crypto_ram3 : 5.63 326.81 20.43 0.00 0.00 344405.49 19761.62 432335.95 00:31:49.619 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:31:49.619 Verification LBA range: start 0x100 length 0x100 00:31:49.619 crypto_ram3 : 5.76 266.49 16.66 0.00 0.00 416966.08 52025.50 380713.75 00:31:49.619 =================================================================================================================== 00:31:49.619 Total : 1342.16 83.88 0.00 0.00 714248.39 938.93 2581110.15 00:31:49.619 00:31:49.619 real 0m8.928s 00:31:49.619 user 0m17.176s 00:31:49.619 sys 0m0.304s 00:31:49.619 08:07:34 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:49.619 08:07:34 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:31:49.619 ************************************ 00:31:49.619 END TEST bdev_verify_big_io 00:31:49.619 ************************************ 00:31:49.619 08:07:34 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:49.619 08:07:34 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:49.619 08:07:34 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:49.619 08:07:34 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:49.619 08:07:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:49.879 ************************************ 00:31:49.879 START TEST bdev_write_zeroes 00:31:49.879 ************************************ 00:31:49.879 08:07:34 blockdev_crypto_qat.bdev_write_zeroes -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:49.879 [2024-07-15 08:07:34.446567] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:31:49.879 [2024-07-15 08:07:34.446620] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1822777 ] 00:31:49.879 [2024-07-15 08:07:34.537170] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:49.879 [2024-07-15 08:07:34.614086] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:49.879 [2024-07-15 08:07:34.635111] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:31:50.139 [2024-07-15 08:07:34.643136] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:50.139 [2024-07-15 08:07:34.651154] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:50.139 [2024-07-15 08:07:34.733148] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:31:52.680 [2024-07-15 08:07:36.881189] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:31:52.680 [2024-07-15 08:07:36.881233] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:52.680 [2024-07-15 08:07:36.881241] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.680 [2024-07-15 08:07:36.889206] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:31:52.680 [2024-07-15 08:07:36.889217] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:52.680 [2024-07-15 08:07:36.889222] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.680 [2024-07-15 08:07:36.897226] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:31:52.680 [2024-07-15 08:07:36.897235] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:52.680 [2024-07-15 08:07:36.897241] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.680 [2024-07-15 08:07:36.905245] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:31:52.680 [2024-07-15 08:07:36.905255] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:52.680 [2024-07-15 08:07:36.905260] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:52.680 Running I/O for 1 seconds... 
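The four "vbdev creation deferred pending base bdev arrival" notices above are the crypto entries of bdev.json being parsed before their Malloc base bdevs exist; once Malloc0 through Malloc3 arrive, the crypto vbdevs are built on top of them with the keys test_dek_qat_cbc, test_dek_qat_xts, test_dek_qat_cbc2 and test_dek_qat_xts2 that were just found. The config file itself is never echoed into the log, so the heredoc below is only a guess at its general shape: the method and parameter names follow the usual SPDK JSON layout as I understand it rather than the test's actual bdev.json, the output path is hypothetical, and the pairing of crypto_ram with Malloc0/test_dek_qat_cbc is assumed.

# Hypothetical sketch of one base-bdev/crypto-bdev pair; only the key and
# bdev names (test_dek_qat_cbc, Malloc0) are taken from the log above.
cat > /tmp/bdev.example.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 16384, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram",
                      "key_name": "test_dek_qat_cbc" } }
      ]
    }
  ]
}
EOF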
00:31:53.619 00:31:53.619 Latency(us) 00:31:53.619 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:53.619 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:53.619 crypto_ram : 1.02 2381.56 9.30 0.00 0.00 53432.06 4839.58 64527.75 00:31:53.619 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:53.619 crypto_ram1 : 1.02 2394.74 9.35 0.00 0.00 52918.04 4789.17 59688.17 00:31:53.619 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:53.619 crypto_ram2 : 1.02 18444.57 72.05 0.00 0.00 6856.86 2129.92 9124.63 00:31:53.619 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:53.619 crypto_ram3 : 1.02 18476.91 72.18 0.00 0.00 6825.77 2129.92 7158.55 00:31:53.619 =================================================================================================================== 00:31:53.619 Total : 41697.78 162.88 0.00 0.00 12167.91 2129.92 64527.75 00:31:53.619 00:31:53.619 real 0m3.863s 00:31:53.619 user 0m3.589s 00:31:53.619 sys 0m0.244s 00:31:53.619 08:07:38 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:53.619 08:07:38 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:31:53.619 ************************************ 00:31:53.619 END TEST bdev_write_zeroes 00:31:53.619 ************************************ 00:31:53.619 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:31:53.619 08:07:38 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:53.619 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:53.619 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:53.619 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:53.619 ************************************ 00:31:53.619 START TEST bdev_json_nonenclosed 00:31:53.619 ************************************ 00:31:53.619 08:07:38 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:53.879 [2024-07-15 08:07:38.387269] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:31:53.879 [2024-07-15 08:07:38.387319] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1823401 ] 00:31:53.879 [2024-07-15 08:07:38.477604] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:53.879 [2024-07-15 08:07:38.552042] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:53.879 [2024-07-15 08:07:38.552098] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:31:53.879 [2024-07-15 08:07:38.552110] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:53.879 [2024-07-15 08:07:38.552117] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:53.879 00:31:53.879 real 0m0.290s 00:31:53.879 user 0m0.178s 00:31:53.879 sys 0m0.110s 00:31:53.879 08:07:38 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:31:53.879 08:07:38 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:53.879 08:07:38 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:31:53.879 ************************************ 00:31:53.879 END TEST bdev_json_nonenclosed 00:31:53.879 ************************************ 00:31:54.138 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:31:54.138 08:07:38 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:31:54.138 08:07:38 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:54.138 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:54.138 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:54.138 08:07:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:54.138 ************************************ 00:31:54.138 START TEST bdev_json_nonarray 00:31:54.138 ************************************ 00:31:54.138 08:07:38 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:54.138 [2024-07-15 08:07:38.749126] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:31:54.138 [2024-07-15 08:07:38.749177] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1823443 ] 00:31:54.138 [2024-07-15 08:07:38.838105] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:54.397 [2024-07-15 08:07:38.916146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:54.398 [2024-07-15 08:07:38.916208] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:31:54.398 [2024-07-15 08:07:38.916220] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:54.398 [2024-07-15 08:07:38.916227] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:54.398 00:31:54.398 real 0m0.280s 00:31:54.398 user 0m0.170s 00:31:54.398 sys 0m0.107s 00:31:54.398 08:07:38 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:31:54.398 08:07:38 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:54.398 08:07:38 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:54.398 ************************************ 00:31:54.398 END TEST bdev_json_nonarray 00:31:54.398 ************************************ 00:31:54.398 08:07:39 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:31:54.398 08:07:39 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:31:54.398 00:31:54.398 real 1m9.437s 00:31:54.398 user 2m45.096s 00:31:54.398 sys 0m6.577s 00:31:54.398 08:07:39 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:54.398 08:07:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:31:54.398 ************************************ 00:31:54.398 END TEST blockdev_crypto_qat 00:31:54.398 ************************************ 00:31:54.398 08:07:39 -- common/autotest_common.sh@1142 -- # return 0 00:31:54.398 08:07:39 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:54.398 08:07:39 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:31:54.398 08:07:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:54.398 08:07:39 -- common/autotest_common.sh@10 -- # set +x 00:31:54.398 ************************************ 00:31:54.398 START TEST chaining 00:31:54.398 ************************************ 00:31:54.398 08:07:39 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:31:54.658 * Looking for test storage... 
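Both JSON negative tests that finished above hand bdevperf a config that is structurally wrong rather than semantically wrong: nonenclosed.json is not wrapped in a top-level {} object, and nonarray.json carries a "subsystems" member that is not an array, so json_config_prepare_ctx rejects each file and the run exits with the expected es=234. A purely illustrative pre-check for those two envelope rules (jq is not part of the test suite, and config.json is a placeholder name; this only mirrors what the loader's error messages say it enforces):

# Illustrative pre-flight check, not part of blockdev.sh or chaining.sh:
# the file must be a JSON object and its "subsystems" member must be an
# array -- exactly the two rules the nonenclosed/nonarray fixtures violate.
jq -e 'type == "object" and (.subsystems | type == "array")' config.json >/dev/null \
  || echo "config would be rejected: not enclosed in {} or 'subsystems' is not an array"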
00:31:54.658 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@7 -- # uname -s 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:54.658 08:07:39 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:54.658 08:07:39 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:54.658 08:07:39 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:54.658 08:07:39 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:54.658 08:07:39 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:54.658 08:07:39 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:54.658 08:07:39 chaining -- paths/export.sh@5 -- # 
export PATH 00:31:54.658 08:07:39 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@47 -- # : 0 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:31:54.658 08:07:39 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:31:54.658 08:07:39 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:31:54.658 08:07:39 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:31:54.658 08:07:39 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:31:54.658 08:07:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@296 -- # e810=() 00:32:02.796 
08:07:47 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@297 -- # x722=() 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@298 -- # mlx=() 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:32:02.796 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:32:02.796 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:32:02.796 
08:07:47 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:32:02.796 Found net devices under 0000:4b:00.0: cvl_0_0 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:32:02.796 Found net devices under 0000:4b:00.1: cvl_0_1 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:32:02.796 08:07:47 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:32:02.797 08:07:47 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip 
link set lo up 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:32:03.058 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:32:03.058 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.606 ms 00:32:03.058 00:32:03.058 --- 10.0.0.2 ping statistics --- 00:32:03.058 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:03.058 rtt min/avg/max/mdev = 0.606/0.606/0.606/0.000 ms 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:32:03.058 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:32:03.058 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.274 ms 00:32:03.058 00:32:03.058 --- 10.0.0.1 ping statistics --- 00:32:03.058 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:03.058 rtt min/avg/max/mdev = 0.274/0.274/0.274/0.000 ms 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@422 -- # return 0 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:32:03.058 08:07:47 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@481 -- # nvmfpid=1827686 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@482 -- # waitforlisten 1827686 00:32:03.058 08:07:47 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@829 -- # '[' -z 1827686 ']' 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:03.058 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:03.058 08:07:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:03.319 [2024-07-15 08:07:47.853260] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
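For readers tracing nvmf_tcp_init above: the target-side port (cvl_0_0) is moved into a private network namespace while the initiator port (cvl_0_1) stays in the root namespace, so host and target genuinely talk across the 10.0.0.0/24 link, and the firewall is opened for NVMe/TCP on port 4420 before nvmf_tgt is started inside that namespace. A condensed sketch of the traced sequence (interface names, addresses and flags are taken from the xtrace; $rootdir stands in for the spdk checkout path shown there, and the backgrounding/pid capture is an assumption):

  ip netns add cvl_0_0_ns_spdk                          # private namespace for the target side
  ip link set cvl_0_0 netns cvl_0_0_ns_spdk             # move the target port into it
  ip addr add 10.0.0.1/24 dev cvl_0_1                   # initiator address, root namespace
  ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0   # target address
  ip link set cvl_0_1 up
  ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
  ip netns exec cvl_0_0_ns_spdk ip link set lo up
  iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT        # let NVMe/TCP traffic back in
  ping -c 1 10.0.0.2                                    # initiator -> target sanity check
  ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1      # target -> initiator sanity check
  ip netns exec cvl_0_0_ns_spdk "$rootdir/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x2 &
  nvmfpid=$!                                            # pid 1827686 in this run
  waitforlisten $nvmfpid                                # block until /var/tmp/spdk.sock accepts RPCs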
00:32:03.319 [2024-07-15 08:07:47.853320] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:03.319 [2024-07-15 08:07:47.943436] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:03.319 [2024-07-15 08:07:48.044455] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:32:03.319 [2024-07-15 08:07:48.044522] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:32:03.319 [2024-07-15 08:07:48.044531] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:32:03.319 [2024-07-15 08:07:48.044540] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:32:03.319 [2024-07-15 08:07:48.044547] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:32:03.319 [2024-07-15 08:07:48.044576] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:04.263 08:07:48 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:04.263 08:07:48 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@69 -- # mktemp 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.AntbyCcD7I 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@69 -- # mktemp 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.YfMQEARoOz 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:04.263 malloc0 00:32:04.263 true 00:32:04.263 true 00:32:04.263 [2024-07-15 08:07:48.815131] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:04.263 crypto0 00:32:04.263 [2024-07-15 08:07:48.823156] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:32:04.263 crypto1 00:32:04.263 [2024-07-15 08:07:48.831285] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:32:04.263 [2024-07-15 08:07:48.847506] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:32:04.263 08:07:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@85 -- # update_stats 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:04.263 08:07:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:04.264 08:07:48 
chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:04.264 08:07:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:04.264 08:07:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:04.264 08:07:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:04.264 08:07:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:04.264 08:07:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:04.526 08:07:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.AntbyCcD7I bs=1K 
count=64 00:32:04.526 64+0 records in 00:32:04.526 64+0 records out 00:32:04.526 65536 bytes (66 kB, 64 KiB) copied, 0.00102924 s, 63.7 MB/s 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.AntbyCcD7I --ob Nvme0n1 --bs 65536 --count 1 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@25 -- # local config 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:32:04.526 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@31 -- # config='{ 00:32:04.526 "subsystems": [ 00:32:04.526 { 00:32:04.526 "subsystem": "bdev", 00:32:04.526 "config": [ 00:32:04.526 { 00:32:04.526 "method": "bdev_nvme_attach_controller", 00:32:04.526 "params": { 00:32:04.526 "trtype": "tcp", 00:32:04.526 "adrfam": "IPv4", 00:32:04.526 "name": "Nvme0", 00:32:04.526 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:04.526 "traddr": "10.0.0.2", 00:32:04.526 "trsvcid": "4420" 00:32:04.526 } 00:32:04.526 }, 00:32:04.526 { 00:32:04.526 "method": "bdev_set_options", 00:32:04.526 "params": { 00:32:04.526 "bdev_auto_examine": false 00:32:04.526 } 00:32:04.526 } 00:32:04.526 ] 00:32:04.526 } 00:32:04.526 ] 00:32:04.526 }' 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.AntbyCcD7I --ob Nvme0n1 --bs 65536 --count 1 00:32:04.526 08:07:49 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:32:04.526 "subsystems": [ 00:32:04.526 { 00:32:04.526 "subsystem": "bdev", 00:32:04.526 "config": [ 00:32:04.526 { 00:32:04.526 "method": "bdev_nvme_attach_controller", 00:32:04.526 "params": { 00:32:04.526 "trtype": "tcp", 00:32:04.526 "adrfam": "IPv4", 00:32:04.526 "name": "Nvme0", 00:32:04.526 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:04.526 "traddr": "10.0.0.2", 00:32:04.526 "trsvcid": "4420" 00:32:04.526 } 00:32:04.526 }, 00:32:04.526 { 00:32:04.526 "method": "bdev_set_options", 00:32:04.526 "params": { 00:32:04.526 "bdev_auto_examine": false 00:32:04.526 } 00:32:04.526 } 00:32:04.526 ] 00:32:04.526 } 00:32:04.526 ] 00:32:04.526 }' 00:32:04.526 [2024-07-15 08:07:49.174521] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
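The write traced at chaining.sh@88-89 goes through the script's spdk_dd wrapper: gen_nvme.sh emits a bdev subsystem config that attaches the remote controller over TCP, jq appends a bdev_set_options entry that turns off bdev_auto_examine, and the resulting JSON is handed to spdk_dd as its config while --if/--ob/--bs/--count describe the copy. Roughly (a sketch: the gen_nvme.sh, jq and spdk_dd arguments are copied from the xtrace, the way the JSON is wired to /dev/fd/62 here is an assumption, and $input is the /tmp/tmp.AntbyCcD7I temp file created at chaining.sh@69):

  config=$(
    "$rootdir/scripts/gen_nvme.sh" --mode=remote --json-with-subsystems \
        --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 |
      jq '.subsystems[0].config[.subsystems[0].config | length] |=
          {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
  )
  # spdk_dd reads its JSON config from fd 62 and copies one 64 KiB block from
  # the random plaintext file onto Nvme0n1, i.e. through the crypto chain.
  "$rootdir/build/bin/spdk_dd" -c /dev/fd/62 \
      --if "$input" --ob Nvme0n1 --bs 65536 --count 1 62<<< "$config"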
00:32:04.526 [2024-07-15 08:07:49.174599] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1827901 ] 00:32:04.526 [2024-07-15 08:07:49.266113] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:04.787 [2024-07-15 08:07:49.361858] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:05.048  Copying: 64/64 [kB] (average 9142 kBps) 00:32:05.048 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:05.048 08:07:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.048 08:07:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:05.048 08:07:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
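Each run of rpc_cmd accel_get_stats / jq above is one call to the script's get_stat helper, and update_stats snapshots the results into the associative stats array so the next step can assert on deltas: here, one extra completed sequence and two extra encrypt operations for a single 64 KiB write through the two stacked crypto bdevs. A sketch of that pattern, reconstructed from the xtrace rather than copied from chaining.sh:

  declare -A stats                         # counter snapshot, as declared at chaining.sh@20
  get_stat() {                             # approximation of chaining.sh@37-44
      local event=$1 opcode=$2
      if [[ -z $opcode ]]; then
          rpc_cmd accel_get_stats | jq -r ".$event"
      else
          rpc_cmd accel_get_stats |
              jq -r ".operations[] | select(.opcode == \"$opcode\").executed"
      fi
  }
  stats[sequence_executed]=$(get_stat sequence_executed)   # update_stats-style snapshot
  stats[encrypt_executed]=$(get_stat executed encrypt)
  # ...one 64 KiB write through the chained crypto bdevs...
  (( $(get_stat sequence_executed) == stats[sequence_executed] + 1 ))
  (( $(get_stat executed encrypt)  == stats[encrypt_executed] + 2 ))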
00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@96 -- # update_stats 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:05.309 08:07:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.309 08:07:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.309 08:07:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:05.310 08:07:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:05.310 08:07:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.310 08:07:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.310 08:07:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.570 08:07:50 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:32:05.571 08:07:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.571 08:07:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:05.571 08:07:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:05.571 08:07:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:05.571 08:07:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:05.571 08:07:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.YfMQEARoOz --ib Nvme0n1 --bs 65536 --count 1 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@25 -- # local config 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:32:05.571 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@31 -- # config='{ 00:32:05.571 "subsystems": [ 00:32:05.571 { 00:32:05.571 "subsystem": "bdev", 00:32:05.571 "config": [ 00:32:05.571 { 00:32:05.571 "method": "bdev_nvme_attach_controller", 00:32:05.571 "params": { 00:32:05.571 "trtype": "tcp", 00:32:05.571 "adrfam": "IPv4", 00:32:05.571 "name": "Nvme0", 00:32:05.571 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:05.571 "traddr": "10.0.0.2", 00:32:05.571 "trsvcid": "4420" 00:32:05.571 } 00:32:05.571 }, 00:32:05.571 { 00:32:05.571 "method": "bdev_set_options", 00:32:05.571 "params": { 00:32:05.571 "bdev_auto_examine": false 00:32:05.571 } 00:32:05.571 } 00:32:05.571 ] 00:32:05.571 } 00:32:05.571 ] 00:32:05.571 }' 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.YfMQEARoOz --ib Nvme0n1 --bs 65536 --count 1 00:32:05.571 08:07:50 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:32:05.571 "subsystems": [ 00:32:05.571 { 00:32:05.571 "subsystem": "bdev", 00:32:05.571 "config": [ 00:32:05.571 { 00:32:05.571 "method": "bdev_nvme_attach_controller", 00:32:05.571 "params": { 00:32:05.571 "trtype": "tcp", 00:32:05.571 "adrfam": "IPv4", 00:32:05.571 "name": "Nvme0", 00:32:05.571 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:05.571 "traddr": "10.0.0.2", 00:32:05.571 "trsvcid": "4420" 00:32:05.571 } 00:32:05.571 }, 00:32:05.571 { 00:32:05.571 "method": "bdev_set_options", 00:32:05.571 "params": { 
00:32:05.571 "bdev_auto_examine": false 00:32:05.571 } 00:32:05.571 } 00:32:05.571 ] 00:32:05.571 } 00:32:05.571 ] 00:32:05.571 }' 00:32:05.571 [2024-07-15 08:07:50.291063] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:05.571 [2024-07-15 08:07:50.291127] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1828079 ] 00:32:05.832 [2024-07-15 08:07:50.383165] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:05.832 [2024-07-15 08:07:50.477216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:06.354  Copying: 64/64 [kB] (average 15 MBps) 00:32:06.354 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:06.354 08:07:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:32:06.354 08:07:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:06.354 08:07:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:06.354 08:07:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:06.354 08:07:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:06.354 08:07:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.AntbyCcD7I /tmp/tmp.YfMQEARoOz 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@25 -- # local config 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:32:06.354 08:07:51 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:32:06.354 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:32:06.615 08:07:51 chaining -- bdev/chaining.sh@31 -- # config='{ 00:32:06.615 "subsystems": [ 00:32:06.615 { 00:32:06.615 "subsystem": "bdev", 00:32:06.615 "config": [ 00:32:06.615 { 00:32:06.615 "method": "bdev_nvme_attach_controller", 00:32:06.616 "params": { 00:32:06.616 "trtype": "tcp", 00:32:06.616 "adrfam": "IPv4", 00:32:06.616 "name": "Nvme0", 00:32:06.616 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:06.616 "traddr": "10.0.0.2", 00:32:06.616 "trsvcid": "4420" 00:32:06.616 } 00:32:06.616 }, 00:32:06.616 { 00:32:06.616 "method": "bdev_set_options", 00:32:06.616 "params": { 00:32:06.616 "bdev_auto_examine": false 00:32:06.616 } 00:32:06.616 } 00:32:06.616 ] 00:32:06.616 } 00:32:06.616 ] 00:32:06.616 }' 00:32:06.616 08:07:51 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:32:06.616 08:07:51 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:32:06.616 "subsystems": [ 00:32:06.616 { 00:32:06.616 "subsystem": "bdev", 00:32:06.616 "config": [ 00:32:06.616 { 00:32:06.616 "method": "bdev_nvme_attach_controller", 00:32:06.616 "params": { 00:32:06.616 "trtype": "tcp", 00:32:06.616 "adrfam": "IPv4", 00:32:06.616 "name": "Nvme0", 00:32:06.616 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:06.616 "traddr": "10.0.0.2", 00:32:06.616 "trsvcid": "4420" 00:32:06.616 } 00:32:06.616 }, 00:32:06.616 { 00:32:06.616 "method": "bdev_set_options", 00:32:06.616 "params": { 00:32:06.616 "bdev_auto_examine": false 00:32:06.616 } 00:32:06.616 } 00:32:06.616 ] 00:32:06.616 
} 00:32:06.616 ] 00:32:06.616 }' 00:32:06.616 [2024-07-15 08:07:51.199003] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:06.616 [2024-07-15 08:07:51.199069] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1828316 ] 00:32:06.616 [2024-07-15 08:07:51.290048] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:06.876 [2024-07-15 08:07:51.385077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:07.136  Copying: 64/64 [kB] (average 12 MBps) 00:32:07.136 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@106 -- # update_stats 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:07.136 08:07:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:07.136 08:07:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:07.136 08:07:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:07.136 08:07:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:07.136 08:07:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:07.136 08:07:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:07.136 08:07:51 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:07.396 08:07:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:07.396 08:07:51 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:32:07.396 08:07:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:07.396 08:07:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:07.396 08:07:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:07.396 08:07:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.AntbyCcD7I --ob Nvme0n1 --bs 4096 --count 16 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@25 -- # local config 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:32:07.396 08:07:51 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:32:07.396 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:32:07.396 08:07:52 chaining -- bdev/chaining.sh@31 -- # config='{ 00:32:07.396 "subsystems": [ 00:32:07.396 { 00:32:07.396 "subsystem": "bdev", 00:32:07.396 "config": [ 00:32:07.396 { 00:32:07.396 "method": "bdev_nvme_attach_controller", 00:32:07.396 "params": { 00:32:07.396 "trtype": "tcp", 00:32:07.396 "adrfam": "IPv4", 00:32:07.396 "name": "Nvme0", 00:32:07.396 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:07.396 "traddr": "10.0.0.2", 00:32:07.396 "trsvcid": "4420" 00:32:07.396 } 00:32:07.396 }, 00:32:07.396 { 00:32:07.396 "method": "bdev_set_options", 00:32:07.396 "params": { 00:32:07.396 "bdev_auto_examine": false 00:32:07.396 } 00:32:07.396 } 00:32:07.396 ] 00:32:07.396 } 00:32:07.396 ] 00:32:07.396 }' 00:32:07.396 08:07:52 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.AntbyCcD7I --ob Nvme0n1 --bs 4096 --count 16 00:32:07.396 08:07:52 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:32:07.396 "subsystems": [ 00:32:07.396 { 00:32:07.396 "subsystem": "bdev", 00:32:07.396 "config": [ 00:32:07.396 { 00:32:07.396 "method": "bdev_nvme_attach_controller", 00:32:07.396 "params": { 00:32:07.396 "trtype": "tcp", 00:32:07.396 "adrfam": "IPv4", 00:32:07.396 "name": "Nvme0", 00:32:07.396 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:07.396 "traddr": "10.0.0.2", 00:32:07.396 "trsvcid": "4420" 00:32:07.396 } 00:32:07.396 }, 00:32:07.396 { 00:32:07.396 "method": "bdev_set_options", 00:32:07.396 "params": { 00:32:07.396 "bdev_auto_examine": false 00:32:07.396 } 00:32:07.396 } 00:32:07.396 ] 00:32:07.396 } 00:32:07.396 ] 00:32:07.396 }' 00:32:07.396 [2024-07-15 08:07:52.085911] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 
initialization... 00:32:07.396 [2024-07-15 08:07:52.085975] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1828436 ] 00:32:07.656 [2024-07-15 08:07:52.175282] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:07.656 [2024-07-15 08:07:52.268197] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:08.213  Copying: 64/64 [kB] (average 6400 kBps) 00:32:08.213 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@114 -- # update_stats 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:08.213 08:07:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.213 08:07:52 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.474 08:07:52 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:08.474 08:07:53 chaining -- 
bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:08.474 08:07:53 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:32:08.474 08:07:53 chaining -- bdev/chaining.sh@117 -- # : 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.YfMQEARoOz --ib Nvme0n1 --bs 4096 --count 16 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@25 -- # local config 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:32:08.475 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@31 -- # config='{ 00:32:08.475 "subsystems": [ 00:32:08.475 { 00:32:08.475 "subsystem": "bdev", 00:32:08.475 "config": [ 00:32:08.475 { 00:32:08.475 "method": "bdev_nvme_attach_controller", 00:32:08.475 "params": { 00:32:08.475 "trtype": "tcp", 00:32:08.475 "adrfam": "IPv4", 00:32:08.475 "name": "Nvme0", 00:32:08.475 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:08.475 "traddr": "10.0.0.2", 00:32:08.475 "trsvcid": "4420" 00:32:08.475 } 00:32:08.475 }, 00:32:08.475 { 00:32:08.475 "method": "bdev_set_options", 00:32:08.475 "params": { 00:32:08.475 "bdev_auto_examine": false 00:32:08.475 } 00:32:08.475 } 00:32:08.475 ] 00:32:08.475 } 00:32:08.475 ] 00:32:08.475 }' 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.YfMQEARoOz --ib Nvme0n1 --bs 4096 --count 16 00:32:08.475 08:07:53 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:32:08.475 "subsystems": [ 00:32:08.475 { 00:32:08.475 "subsystem": "bdev", 00:32:08.475 "config": [ 00:32:08.475 { 00:32:08.475 "method": "bdev_nvme_attach_controller", 00:32:08.475 "params": { 00:32:08.475 "trtype": "tcp", 00:32:08.475 "adrfam": "IPv4", 00:32:08.475 "name": "Nvme0", 00:32:08.475 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:32:08.475 "traddr": "10.0.0.2", 00:32:08.475 "trsvcid": "4420" 
00:32:08.475 } 00:32:08.475 }, 00:32:08.475 { 00:32:08.475 "method": "bdev_set_options", 00:32:08.475 "params": { 00:32:08.475 "bdev_auto_examine": false 00:32:08.475 } 00:32:08.475 } 00:32:08.475 ] 00:32:08.475 } 00:32:08.475 ] 00:32:08.475 }' 00:32:08.735 [2024-07-15 08:07:53.274371] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:08.735 [2024-07-15 08:07:53.274438] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1828744 ] 00:32:08.735 [2024-07-15 08:07:53.364993] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:08.735 [2024-07-15 08:07:53.458620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:09.305  Copying: 64/64 [kB] (average 488 kBps) 00:32:09.305 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@43 -- 
# rpc_cmd accel_get_stats 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.AntbyCcD7I /tmp/tmp.YfMQEARoOz 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.AntbyCcD7I /tmp/tmp.YfMQEARoOz 00:32:09.566 08:07:54 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@117 -- # sync 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@120 -- # set +e 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:32:09.566 rmmod nvme_tcp 00:32:09.566 rmmod nvme_fabrics 00:32:09.566 rmmod nvme_keyring 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@124 -- # set -e 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@125 -- # return 0 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@489 -- # '[' -n 1827686 ']' 00:32:09.566 08:07:54 chaining -- nvmf/common.sh@490 -- # killprocess 1827686 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@948 -- # '[' -z 1827686 ']' 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@952 -- # kill -0 1827686 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@953 -- # uname 00:32:09.566 08:07:54 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1827686 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1827686' 00:32:09.827 killing process with pid 1827686 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@967 -- # kill 
1827686 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@972 -- # wait 1827686 00:32:09.827 08:07:54 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:32:09.827 08:07:54 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:32:09.827 08:07:54 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:32:09.827 08:07:54 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:32:09.827 08:07:54 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:32:09.827 08:07:54 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:09.827 08:07:54 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:12.370 08:07:56 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:32:12.370 08:07:56 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:32:12.370 08:07:56 chaining -- bdev/chaining.sh@132 -- # bperfpid=1829222 00:32:12.370 08:07:56 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1829222 00:32:12.370 08:07:56 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:32:12.370 08:07:56 chaining -- common/autotest_common.sh@829 -- # '[' -z 1829222 ']' 00:32:12.370 08:07:56 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:12.370 08:07:56 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:12.370 08:07:56 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:12.370 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:12.370 08:07:56 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:12.370 08:07:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:12.370 [2024-07-15 08:07:56.683678] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 
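The trace above starts a fresh bdevperf instance in idle mode (-z, --wait-for-rpc) and then blocks in waitforlisten until its UNIX-domain RPC socket answers, before any bdevs are configured. A minimal sketch of that launch-and-drive pattern, assuming the default socket path /var/tmp/spdk.sock and using rpc_get_methods as the readiness probe (the real waitforlisten helper in autotest_common.sh is more elaborate):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # Start bdevperf idle (-z) and hold off I/O until configuration arrives over RPC.
  "$SPDK/build/examples/bdevperf" -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
  bperfpid=$!
  # Poll the default RPC socket until the application answers.
  until "$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
  done
  # ...create bdevs over RPC here, then kick off the actual run:
  "$SPDK/examples/bdev/bdevperf/bdevperf.py" perform_tests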
00:32:12.370 [2024-07-15 08:07:56.683745] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1829222 ] 00:32:12.370 [2024-07-15 08:07:56.774954] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:12.370 [2024-07-15 08:07:56.868915] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:12.942 08:07:57 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:12.942 08:07:57 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:12.942 08:07:57 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:32:12.942 08:07:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:12.942 08:07:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:12.942 malloc0 00:32:12.942 true 00:32:12.942 true 00:32:12.942 [2024-07-15 08:07:57.673757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:12.942 crypto0 00:32:12.942 [2024-07-15 08:07:57.681780] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:32:12.942 crypto1 00:32:12.942 08:07:57 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:12.942 08:07:57 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:13.203 Running I/O for 5 seconds... 00:32:18.492 00:32:18.492 Latency(us) 00:32:18.492 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:18.492 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:32:18.492 Verification LBA range: start 0x0 length 0x2000 00:32:18.492 crypto1 : 5.01 14291.04 55.82 0.00 0.00 17864.74 4511.90 11746.07 00:32:18.492 =================================================================================================================== 00:32:18.492 Total : 14291.04 55.82 0.00 0.00 17864.74 4511.90 11746.07 00:32:18.492 0 00:32:18.492 08:08:02 chaining -- bdev/chaining.sh@146 -- # killprocess 1829222 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@948 -- # '[' -z 1829222 ']' 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@952 -- # kill -0 1829222 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@953 -- # uname 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1829222 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1829222' 00:32:18.492 killing process with pid 1829222 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@967 -- # kill 1829222 00:32:18.492 Received shutdown signal, test time was about 5.000000 seconds 00:32:18.492 00:32:18.492 Latency(us) 00:32:18.492 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:18.492 =================================================================================================================== 00:32:18.492 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:18.492 08:08:02 chaining -- common/autotest_common.sh@972 -- # wait 1829222 00:32:18.492 08:08:03 chaining -- bdev/chaining.sh@152 -- # 
bperfpid=1830356 00:32:18.492 08:08:03 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1830356 00:32:18.492 08:08:03 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:32:18.492 08:08:03 chaining -- common/autotest_common.sh@829 -- # '[' -z 1830356 ']' 00:32:18.492 08:08:03 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:18.492 08:08:03 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:18.492 08:08:03 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:18.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:18.492 08:08:03 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:18.492 08:08:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:18.492 [2024-07-15 08:08:03.072792] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:18.493 [2024-07-15 08:08:03.072843] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1830356 ] 00:32:18.493 [2024-07-15 08:08:03.159679] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:18.493 [2024-07-15 08:08:03.221758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:19.431 08:08:03 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:19.431 08:08:03 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:19.431 08:08:03 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:32:19.431 08:08:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:19.431 08:08:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:19.431 malloc0 00:32:19.431 true 00:32:19.431 true 00:32:19.431 [2024-07-15 08:08:03.997968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:32:19.431 [2024-07-15 08:08:03.997999] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:32:19.431 [2024-07-15 08:08:03.998011] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2330ae0 00:32:19.431 [2024-07-15 08:08:03.998018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:32:19.431 [2024-07-15 08:08:03.998885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:32:19.431 [2024-07-15 08:08:03.998902] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:32:19.431 pt0 00:32:19.431 [2024-07-15 08:08:04.005996] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:19.431 crypto0 00:32:19.431 [2024-07-15 08:08:04.014015] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:32:19.431 crypto1 00:32:19.431 08:08:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:19.431 08:08:04 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:19.431 Running I/O for 5 seconds... 
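The rpc_cmd block traced above assembles the stack this run exercises: a RAM-backed malloc0, a passthru bdev pt0 that claims it, and two crypto bdevs tied to the pre-created accel keys "key0" and "key1"; the Latency table that follows reports the throughput measured through the top of that stack. A hedged sketch of how such a stack is typically built with rpc.py — the malloc sizes, the exact layering of crypto0/crypto1, and the -n/--key-name flag spelling are assumptions, not taken from chaining.sh, so check each method's -h output for your SPDK version:

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  "$RPC" bdev_malloc_create -b malloc0 64 4096      # 64 MiB malloc bdev, 4 KiB blocks (sizes assumed)
  "$RPC" bdev_passthru_create -b malloc0 -p pt0     # pt0 claims malloc0
  "$RPC" bdev_crypto_create -n key0 pt0 crypto0     # base bdev and key-name flag assumed
  "$RPC" bdev_crypto_create -n key1 crypto0 crypto1 # second crypto layer for chained encrypt/decrypt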
00:32:24.709 00:32:24.709 Latency(us) 00:32:24.709 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:24.709 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:32:24.709 Verification LBA range: start 0x0 length 0x2000 00:32:24.709 crypto1 : 5.01 11240.74 43.91 0.00 0.00 22719.86 5192.47 13611.32 00:32:24.709 =================================================================================================================== 00:32:24.709 Total : 11240.74 43.91 0.00 0.00 22719.86 5192.47 13611.32 00:32:24.709 0 00:32:24.709 08:08:09 chaining -- bdev/chaining.sh@167 -- # killprocess 1830356 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@948 -- # '[' -z 1830356 ']' 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@952 -- # kill -0 1830356 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@953 -- # uname 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1830356 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1830356' 00:32:24.709 killing process with pid 1830356 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@967 -- # kill 1830356 00:32:24.709 Received shutdown signal, test time was about 5.000000 seconds 00:32:24.709 00:32:24.709 Latency(us) 00:32:24.709 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:24.709 =================================================================================================================== 00:32:24.709 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@972 -- # wait 1830356 00:32:24.709 08:08:09 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:32:24.709 08:08:09 chaining -- bdev/chaining.sh@170 -- # killprocess 1830356 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@948 -- # '[' -z 1830356 ']' 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@952 -- # kill -0 1830356 00:32:24.709 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (1830356) - No such process 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 1830356 is not found' 00:32:24.709 Process with pid 1830356 is not found 00:32:24.709 08:08:09 chaining -- bdev/chaining.sh@171 -- # wait 1830356 00:32:24.709 08:08:09 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:32:24.709 08:08:09 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:32:24.709 08:08:09 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:32:24.709 08:08:09 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:32:24.709 08:08:09 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:32:24.709 08:08:09 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:32:24.709 08:08:09 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:24.709 08:08:09 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:24.710 08:08:09 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:32:24.710 08:08:09 
chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:32:24.710 08:08:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@296 -- # e810=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@297 -- # x722=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@298 -- # mlx=() 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:32:24.710 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 
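The xtrace above is nvmf/common.sh classifying the host's NICs for the NVMe-oF part of the test: it builds lists of Intel E810 (0x1592/0x159b), X722 (0x37d2) and Mellanox device IDs, then walks each matching PCI function and collects the kernel net interfaces under its sysfs node (cvl_0_0 and cvl_0_1 on this host). A rough, deliberately small equivalent of that sysfs walk; the real helper also caches the PCI bus and handles RDMA-only transports:

  # Enumerate net interfaces backed by Intel E810 functions (vendor 0x8086, device 0x159b).
  for pci in /sys/bus/pci/devices/*; do
      vendor=$(cat "$pci/vendor")
      device=$(cat "$pci/device")
      if [[ $vendor == 0x8086 && $device == 0x159b ]]; then
          ls "$pci/net" 2>/dev/null   # prints e.g. cvl_0_0, cvl_0_1
      fi
  done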
00:32:24.710 08:08:09 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:32:24.710 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:32:24.710 Found net devices under 0000:4b:00.0: cvl_0_0 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:32:24.710 Found net devices under 0000:4b:00.1: cvl_0_1 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:32:24.710 08:08:09 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:32:24.971 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:32:24.971 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.623 ms 00:32:24.971 00:32:24.971 --- 10.0.0.2 ping statistics --- 00:32:24.971 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:24.971 rtt min/avg/max/mdev = 0.623/0.623/0.623/0.000 ms 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:32:24.971 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:32:24.971 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.268 ms 00:32:24.971 00:32:24.971 --- 10.0.0.1 ping statistics --- 00:32:24.971 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:32:24.971 rtt min/avg/max/mdev = 0.268/0.268/0.268/0.000 ms 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@422 -- # return 0 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:32:24.971 08:08:09 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@481 -- # nvmfpid=1831331 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@482 -- # waitforlisten 1831331 00:32:24.971 08:08:09 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@829 -- # '[' -z 1831331 ']' 00:32:24.971 08:08:09 chaining -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:24.971 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:24.971 08:08:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:25.232 [2024-07-15 08:08:09.773867] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:25.232 [2024-07-15 08:08:09.773956] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:25.232 [2024-07-15 08:08:09.863190] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:25.232 [2024-07-15 08:08:09.962327] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:32:25.232 [2024-07-15 08:08:09.962391] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:32:25.232 [2024-07-15 08:08:09.962401] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:32:25.232 [2024-07-15 08:08:09.962409] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:32:25.232 [2024-07-15 08:08:09.962416] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:32:25.232 [2024-07-15 08:08:09.962450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:26.177 08:08:10 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:26.177 08:08:10 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:32:26.177 08:08:10 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:26.177 malloc0 00:32:26.177 [2024-07-15 08:08:10.682048] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:32:26.177 [2024-07-15 08:08:10.698251] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:26.177 08:08:10 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:32:26.177 08:08:10 chaining -- bdev/chaining.sh@189 -- # bperfpid=1831628 00:32:26.177 08:08:10 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1831628 /var/tmp/bperf.sock 00:32:26.177 08:08:10 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@829 
-- # '[' -z 1831628 ']' 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:32:26.177 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:26.177 08:08:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:26.177 [2024-07-15 08:08:10.777371] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:26.177 [2024-07-15 08:08:10.777430] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1831628 ] 00:32:26.177 [2024-07-15 08:08:10.867349] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:26.438 [2024-07-15 08:08:10.944626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:27.009 08:08:11 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:27.009 08:08:11 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:27.009 08:08:11 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:32:27.009 08:08:11 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:32:27.270 [2024-07-15 08:08:11.942998] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:27.270 nvme0n1 00:32:27.270 true 00:32:27.270 crypto0 00:32:27.270 08:08:11 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:32:27.530 Running I/O for 5 seconds... 
00:32:32.859 00:32:32.859 Latency(us) 00:32:32.859 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:32.859 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:32:32.859 Verification LBA range: start 0x0 length 0x2000 00:32:32.859 crypto0 : 5.02 8856.53 34.60 0.00 0.00 28817.20 2760.07 23088.84 00:32:32.859 =================================================================================================================== 00:32:32.859 Total : 8856.53 34.60 0.00 0.00 28817.20 2760.07 23088.84 00:32:32.859 0 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@205 -- # sequence=88938 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@206 -- # encrypt=44469 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:32.859 08:08:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@207 -- # decrypt=44469 
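The repeated get_stat_bperf calls above all reduce to one accel_get_stats RPC against the bdevperf socket, filtered per opcode with jq. The two queries below reproduce that pattern with the socket path and jq filters visible in this log; only the shell wrapper differs from chaining.sh's helpers:

  RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  # Total number of completed accel sequences (chained operations).
  "$RPC" -s /var/tmp/bperf.sock accel_get_stats | jq -r '.sequence_executed'
  # Completed operations for a single opcode, e.g. decrypt.
  "$RPC" -s /var/tmp/bperf.sock accel_get_stats \
      | jq -r '.operations[] | select(.opcode == "decrypt").executed'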
00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:32:33.147 08:08:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:33.408 08:08:17 chaining -- bdev/chaining.sh@208 -- # crc32c=88938 00:32:33.408 08:08:17 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:32:33.408 08:08:17 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:32:33.408 08:08:17 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:32:33.408 08:08:17 chaining -- bdev/chaining.sh@214 -- # killprocess 1831628 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@948 -- # '[' -z 1831628 ']' 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@952 -- # kill -0 1831628 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@953 -- # uname 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1831628 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1831628' 00:32:33.408 killing process with pid 1831628 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@967 -- # kill 1831628 00:32:33.408 Received shutdown signal, test time was about 5.000000 seconds 00:32:33.408 00:32:33.408 Latency(us) 00:32:33.408 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:33.408 =================================================================================================================== 00:32:33.408 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:33.408 08:08:17 chaining -- common/autotest_common.sh@972 -- # wait 1831628 00:32:33.408 08:08:18 chaining -- bdev/chaining.sh@219 -- # bperfpid=1832862 00:32:33.408 08:08:18 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1832862 /var/tmp/bperf.sock 00:32:33.408 08:08:18 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:32:33.408 08:08:18 chaining -- common/autotest_common.sh@829 -- # '[' -z 1832862 ']' 00:32:33.408 08:08:18 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:32:33.408 08:08:18 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:33.408 08:08:18 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:32:33.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:32:33.408 08:08:18 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:33.408 08:08:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:33.408 [2024-07-15 08:08:18.155029] Starting SPDK v24.09-pre git sha1 897e912d5 / DPDK 24.03.0 initialization... 00:32:33.408 [2024-07-15 08:08:18.155081] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1832862 ] 00:32:33.669 [2024-07-15 08:08:18.243068] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:33.669 [2024-07-15 08:08:18.314943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:34.610 08:08:19 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:34.610 08:08:19 chaining -- common/autotest_common.sh@862 -- # return 0 00:32:34.610 08:08:19 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:32:34.610 08:08:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:32:34.610 [2024-07-15 08:08:19.332669] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:32:34.610 nvme0n1 00:32:34.610 true 00:32:34.610 crypto0 00:32:34.871 08:08:19 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:32:34.871 Running I/O for 5 seconds... 00:32:40.154 00:32:40.154 Latency(us) 00:32:40.154 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:40.154 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:32:40.154 Verification LBA range: start 0x0 length 0x200 00:32:40.154 crypto0 : 5.01 2260.32 141.27 0.00 0.00 13863.72 1348.53 19055.85 00:32:40.154 =================================================================================================================== 00:32:40.154 Total : 2260.32 141.27 0.00 0.00 13863.72 1348.53 19055.85 00:32:40.154 0 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@39 -- # opcode= 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@233 -- # sequence=22638 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:40.154 
08:08:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:32:40.154 08:08:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@234 -- # encrypt=11319 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:32:40.414 08:08:24 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@235 -- # decrypt=11319 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:32:40.414 08:08:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:32:40.673 08:08:25 chaining -- bdev/chaining.sh@236 -- # crc32c=22638 00:32:40.673 08:08:25 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:32:40.673 08:08:25 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:32:40.673 08:08:25 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:32:40.673 08:08:25 chaining -- bdev/chaining.sh@242 -- # killprocess 1832862 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@948 -- # '[' -z 1832862 ']' 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@952 -- # kill -0 1832862 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@953 -- # uname 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1832862 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:40.673 
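The counters gathered in this 64 KiB / queue-depth-32 run feed the arithmetic checks that follow (chaining.sh@238-240): the totals line up only if every executed sequence chained exactly one encrypt or decrypt with one crc32c operation. A sketch of those checks using the values reported in this excerpt:

  # Values observed in this run: 11319 encrypts, 11319 decrypts,
  # 22638 crc32c operations and 22638 executed sequences.
  sequence=22638; encrypt=11319; decrypt=11319; crc32c=22638
  (( sequence > 0 ))                  || echo "no accel sequences executed"
  (( encrypt + decrypt == sequence )) || echo "encrypt/decrypt vs sequence mismatch"
  (( encrypt + decrypt == crc32c ))   || echo "crypto vs crc32c count mismatch"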
08:08:25 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1832862' 00:32:40.673 killing process with pid 1832862 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@967 -- # kill 1832862 00:32:40.673 Received shutdown signal, test time was about 5.000000 seconds 00:32:40.673 00:32:40.673 Latency(us) 00:32:40.673 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:40.673 =================================================================================================================== 00:32:40.673 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:40.673 08:08:25 chaining -- common/autotest_common.sh@972 -- # wait 1832862 00:32:40.933 08:08:25 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@117 -- # sync 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@120 -- # set +e 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:32:40.933 rmmod nvme_tcp 00:32:40.933 rmmod nvme_fabrics 00:32:40.933 rmmod nvme_keyring 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@124 -- # set -e 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@125 -- # return 0 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@489 -- # '[' -n 1831331 ']' 00:32:40.933 08:08:25 chaining -- nvmf/common.sh@490 -- # killprocess 1831331 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@948 -- # '[' -z 1831331 ']' 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@952 -- # kill -0 1831331 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@953 -- # uname 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1831331 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1831331' 00:32:40.933 killing process with pid 1831331 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@967 -- # kill 1831331 00:32:40.933 08:08:25 chaining -- common/autotest_common.sh@972 -- # wait 1831331 00:32:41.192 08:08:25 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:32:41.192 08:08:25 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:32:41.192 08:08:25 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:32:41.192 08:08:25 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:32:41.192 08:08:25 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:32:41.192 08:08:25 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:32:41.192 08:08:25 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:32:41.192 08:08:25 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:32:43.730 08:08:27 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:32:43.730 08:08:27 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM 
EXIT 00:32:43.730 00:32:43.730 real 0m48.757s 00:32:43.730 user 0m58.784s 00:32:43.730 sys 0m11.464s 00:32:43.730 08:08:27 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:43.730 08:08:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:32:43.730 ************************************ 00:32:43.730 END TEST chaining 00:32:43.730 ************************************ 00:32:43.730 08:08:27 -- common/autotest_common.sh@1142 -- # return 0 00:32:43.730 08:08:27 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:32:43.730 08:08:27 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:32:43.730 08:08:27 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:32:43.730 08:08:27 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:32:43.730 08:08:27 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:32:43.730 08:08:27 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:32:43.730 08:08:27 -- common/autotest_common.sh@722 -- # xtrace_disable 00:32:43.730 08:08:27 -- common/autotest_common.sh@10 -- # set +x 00:32:43.730 08:08:27 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:32:43.730 08:08:27 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:32:43.730 08:08:27 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:32:43.730 08:08:27 -- common/autotest_common.sh@10 -- # set +x 00:32:50.322 INFO: APP EXITING 00:32:50.322 INFO: killing all VMs 00:32:50.322 INFO: killing vhost app 00:32:50.322 INFO: EXIT DONE 00:32:54.525 Waiting for block devices as requested 00:32:54.525 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:32:54.525 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:32:54.786 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:32:54.786 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:32:55.047 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:32:55.047 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:32:55.047 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:32:55.047 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:32:55.307 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:32:55.307 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:32:59.577 Cleaning 00:32:59.577 Removing: /var/run/dpdk/spdk0/config 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:32:59.577 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:32:59.577 Removing: /var/run/dpdk/spdk0/hugepage_info 00:32:59.577 Removing: /dev/shm/nvmf_trace.0 00:32:59.577 Removing: /dev/shm/spdk_tgt_trace.pid1539358 00:32:59.837 Removing: /var/run/dpdk/spdk0 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1534675 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1537077 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1539358 00:32:59.837 Removing: 
/var/run/dpdk/spdk_pid1539899 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1541262 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1541549 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1542513 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1542811 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1542936 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1546380 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1548192 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1548476 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1548813 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1549190 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1549544 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1549855 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1549925 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1550254 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1551254 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1554516 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1554832 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1555015 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1555252 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1555342 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1555625 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1555794 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1556000 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1556302 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1556623 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1556892 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1557010 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1557304 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1557622 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1557945 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1558109 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1558311 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1558620 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1558935 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1559191 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1559313 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1559614 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1559938 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1560261 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1560388 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1560629 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1560935 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1561259 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1561583 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1561913 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1562235 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1562563 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1562887 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1563216 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1563284 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1563660 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1564223 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1564447 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1564742 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1568751 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1570953 00:32:59.837 Removing: /var/run/dpdk/spdk_pid1573185 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1574129 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1576112 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1576443 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1576465 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1576493 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1581202 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1581825 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1583043 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1583375 00:33:00.097 Removing: 
/var/run/dpdk/spdk_pid1589256 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1591004 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1591967 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1596611 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1598363 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1599409 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1603940 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1606535 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1607549 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1618884 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1621192 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1622234 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1633098 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1635676 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1637346 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1649014 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1652528 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1653549 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1665069 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1667719 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1668882 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1681807 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1684395 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1685695 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1697547 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1701597 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1702767 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1703910 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1707389 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1713924 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1716636 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1722236 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1726472 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1732906 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1736050 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1743586 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1746596 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1753609 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1756065 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1762823 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1765328 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1769931 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1770389 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1770708 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1771233 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1771713 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1772606 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1773450 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1773860 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1776127 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1778666 00:33:00.097 Removing: /var/run/dpdk/spdk_pid1780626 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1782505 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1784445 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1786532 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1788689 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1790394 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1791084 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1791549 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1793937 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1796235 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1798572 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1800005 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1801314 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1802020 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1802203 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1802305 00:33:00.357 Removing: 
/var/run/dpdk/spdk_pid1802617 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1802652 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1803933 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1805777 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1807867 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1809248 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1810175 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1810474 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1810539 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1810721 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1811794 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1812419 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1813036 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1815416 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1817719 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1820009 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1821252 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1822777 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1823401 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1823443 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1827901 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1828079 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1828316 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1828436 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1828744 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1829222 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1830356 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1831628 00:33:00.357 Removing: /var/run/dpdk/spdk_pid1832862 00:33:00.357 Clean 00:33:00.617 08:08:45 -- common/autotest_common.sh@1451 -- # return 0 00:33:00.617 08:08:45 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:33:00.617 08:08:45 -- common/autotest_common.sh@728 -- # xtrace_disable 00:33:00.617 08:08:45 -- common/autotest_common.sh@10 -- # set +x 00:33:00.617 08:08:45 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:33:00.617 08:08:45 -- common/autotest_common.sh@728 -- # xtrace_disable 00:33:00.617 08:08:45 -- common/autotest_common.sh@10 -- # set +x 00:33:00.617 08:08:45 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:33:00.617 08:08:45 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:33:00.617 08:08:45 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:33:00.617 08:08:45 -- spdk/autotest.sh@391 -- # hash lcov 00:33:00.617 08:08:45 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:33:00.617 08:08:45 -- spdk/autotest.sh@393 -- # hostname 00:33:00.617 08:08:45 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-CYP-06 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:33:00.877 geninfo: WARNING: invalid characters removed from testname! 
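
The "Cleaning" / "Removing:" phase a few lines above clears the DPDK runtime state left behind by the target applications: the per-process directories under /var/run/dpdk (spdk0 plus the stale spdk_pid* directories from earlier spdk_tgt instances) and the trace shared-memory files under /dev/shm. A minimal sketch of that step, assuming plain recursive removal (the actual helper lives in SPDK's autotest scripts; only the paths are taken from the log entries above):

# Sketch of the cleanup phase logged above; the paths mirror the "Removing:" entries,
# everything else is an assumption about how the helper is implemented.
cleanup_runtime_state() {
    sudo rm -rf /var/run/dpdk/spdk0                                  # config, fbarray_memseg-*, hugepage_info
    sudo rm -rf /var/run/dpdk/spdk_pid*                              # stale per-PID DPDK runtime dirs
    sudo rm -f /dev/shm/nvmf_trace.* /dev/shm/spdk_tgt_trace.pid*    # trace shared-memory files
}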
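
The lcov invocations that start here and continue below build the coverage report for the run: a post-test capture (cov_test.info) is merged with the pre-test baseline (cov_base.info) into cov_total.info, and successive -r passes strip DPDK, system headers, and the example/app sources before the intermediate tracefiles are deleted. A condensed sketch of that flow, with SPDK_DIR and OUT as placeholders for the workspace paths in the log (the additional --rc genhtml_*/geninfo_* options shown in the log are omitted here for brevity):

# Condensed sketch of the coverage flow; flags and tracefile names follow the log,
# the directory variables are placeholders.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
OUT=$SPDK_DIR/../output
LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external -q"

lcov $LCOV_OPTS -c -d "$SPDK_DIR" -t "$(hostname)" -o "$OUT/cov_test.info"                 # capture test counters
lcov $LCOV_OPTS -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"   # merge with baseline

# Drop sources that are not SPDK proper (DPDK, system headers, example apps).
for pattern in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
    lcov $LCOV_OPTS -r "$OUT/cov_total.info" "$pattern" -o "$OUT/cov_total.info"
done

rm -f "$OUT/cov_base.info" "$OUT/cov_test.info"   # keep only the filtered total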
00:33:22.838 08:09:07 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:25.377 08:09:09 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:27.913 08:09:12 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:29.821 08:09:14 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:31.726 08:09:16 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:33.632 08:09:18 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:33:35.589 08:09:20 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:33:35.849 08:09:20 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:35.849 08:09:20 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:33:35.849 08:09:20 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:35.849 08:09:20 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:35.849 08:09:20 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:35.849 08:09:20 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:35.849 08:09:20 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:35.849 08:09:20 -- paths/export.sh@5 -- $ export PATH 00:33:35.849 08:09:20 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:35.849 08:09:20 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:35.849 08:09:20 -- common/autobuild_common.sh@444 -- $ date +%s 00:33:35.849 08:09:20 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721023760.XXXXXX 00:33:35.849 08:09:20 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721023760.8eJW7n 00:33:35.849 08:09:20 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:33:35.849 08:09:20 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:33:35.849 08:09:20 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:33:35.849 08:09:20 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:33:35.849 08:09:20 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:33:35.849 08:09:20 -- common/autobuild_common.sh@460 -- $ get_config_params 00:33:35.850 08:09:20 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:33:35.850 08:09:20 -- common/autotest_common.sh@10 -- $ set +x 00:33:35.850 08:09:20 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:33:35.850 08:09:20 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:33:35.850 08:09:20 -- pm/common@17 -- $ local monitor 00:33:35.850 08:09:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:35.850 08:09:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:35.850 08:09:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:35.850 08:09:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:35.850 08:09:20 -- pm/common@25 -- $ sleep 1 00:33:35.850 08:09:20 -- 
pm/common@21 -- $ date +%s 00:33:35.850 08:09:20 -- pm/common@21 -- $ date +%s 00:33:35.850 08:09:20 -- pm/common@21 -- $ date +%s 00:33:35.850 08:09:20 -- pm/common@21 -- $ date +%s 00:33:35.850 08:09:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721023760 00:33:35.850 08:09:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721023760 00:33:35.850 08:09:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721023760 00:33:35.850 08:09:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721023760 00:33:35.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721023760_collect-vmstat.pm.log 00:33:35.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721023760_collect-cpu-load.pm.log 00:33:35.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721023760_collect-cpu-temp.pm.log 00:33:35.850 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721023760_collect-bmc-pm.bmc.pm.log 00:33:36.790 08:09:21 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:33:36.790 08:09:21 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j128 00:33:36.790 08:09:21 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:36.790 08:09:21 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:33:36.790 08:09:21 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:33:36.790 08:09:21 -- spdk/autopackage.sh@19 -- $ timing_finish 00:33:36.790 08:09:21 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:33:36.790 08:09:21 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:33:36.790 08:09:21 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:33:36.790 08:09:21 -- spdk/autopackage.sh@20 -- $ exit 0 00:33:36.790 08:09:21 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:33:36.790 08:09:21 -- pm/common@29 -- $ signal_monitor_resources TERM 00:33:36.790 08:09:21 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:33:36.790 08:09:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:36.790 08:09:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:33:36.790 08:09:21 -- pm/common@44 -- $ pid=1846358 00:33:36.790 08:09:21 -- pm/common@50 -- $ kill -TERM 1846358 00:33:36.790 08:09:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:36.790 08:09:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:33:36.790 08:09:21 -- pm/common@44 -- $ pid=1846359 00:33:36.790 08:09:21 -- pm/common@50 
-- $ kill -TERM 1846359 00:33:36.790 08:09:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:36.790 08:09:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:33:36.790 08:09:21 -- pm/common@44 -- $ pid=1846360 00:33:36.790 08:09:21 -- pm/common@50 -- $ kill -TERM 1846360 00:33:36.790 08:09:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:33:36.790 08:09:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:33:36.790 08:09:21 -- pm/common@44 -- $ pid=1846383 00:33:36.790 08:09:21 -- pm/common@50 -- $ sudo -E kill -TERM 1846383 00:33:36.790 + [[ -n 1407472 ]] 00:33:36.790 + sudo kill 1407472 00:33:37.061 [Pipeline] } 00:33:37.082 [Pipeline] // stage 00:33:37.089 [Pipeline] } 00:33:37.108 [Pipeline] // timeout 00:33:37.114 [Pipeline] } 00:33:37.132 [Pipeline] // catchError 00:33:37.138 [Pipeline] } 00:33:37.157 [Pipeline] // wrap 00:33:37.163 [Pipeline] } 00:33:37.180 [Pipeline] // catchError 00:33:37.189 [Pipeline] stage 00:33:37.192 [Pipeline] { (Epilogue) 00:33:37.207 [Pipeline] catchError 00:33:37.209 [Pipeline] { 00:33:37.224 [Pipeline] echo 00:33:37.226 Cleanup processes 00:33:37.232 [Pipeline] sh 00:33:37.524 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:37.524 1846464 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:33:37.524 1846865 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:37.541 [Pipeline] sh 00:33:37.829 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:37.829 ++ grep -v 'sudo pgrep' 00:33:37.829 ++ awk '{print $1}' 00:33:37.829 + sudo kill -9 1846464 00:33:37.841 [Pipeline] sh 00:33:38.128 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:33:50.361 [Pipeline] sh 00:33:50.652 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:33:50.652 Artifacts sizes are good 00:33:50.672 [Pipeline] archiveArtifacts 00:33:50.681 Archiving artifacts 00:33:50.865 [Pipeline] sh 00:33:51.156 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:33:51.174 [Pipeline] cleanWs 00:33:51.186 [WS-CLEANUP] Deleting project workspace... 00:33:51.186 [WS-CLEANUP] Deferred wipeout is used... 00:33:51.194 [WS-CLEANUP] done 00:33:51.198 [Pipeline] } 00:33:51.218 [Pipeline] // catchError 00:33:51.231 [Pipeline] sh 00:33:51.518 + logger -p user.info -t JENKINS-CI 00:33:51.527 [Pipeline] } 00:33:51.544 [Pipeline] // stage 00:33:51.550 [Pipeline] } 00:33:51.569 [Pipeline] // node 00:33:51.576 [Pipeline] End of Pipeline 00:33:51.613 Finished: SUCCESS
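
The epilogue above stops the power/CPU monitors that autopackage started earlier by reading their PID files under the output/power directory and signalling each one (the BMC collector was launched with sudo, so it is also killed with sudo), then sweeps any SPDK processes still running in the workspace with the same pgrep pattern used in the prologue before the artifacts are compressed, size-checked, and archived. A simplified sketch of that teardown; the function name is illustrative, while the array and PID-file names mirror the log:

POWER_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power
MONITOR_RESOURCES=(collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm)

stop_monitors() {   # illustrative name; the log reaches this via stop_monitor_resources
    local monitor pid
    for monitor in "${MONITOR_RESOURCES[@]}"; do
        [[ -e $POWER_DIR/$monitor.pid ]] || continue
        pid=$(<"$POWER_DIR/$monitor.pid")
        if [[ $monitor == collect-bmc-pm ]]; then
            sudo -E kill -TERM "$pid"   # the BMC collector runs under sudo
        else
            kill -TERM "$pid"
        fi
    done
}

# Sweep any remaining SPDK processes in the workspace (mirrors the pgrep/awk/kill -9
# sequence in the log); WORKSPACE is a placeholder for the Jenkins workspace path.
pids=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
[[ -n $pids ]] && sudo kill -9 $pids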